Nov 22 04:07:27 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 04:07:27 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 04:07:27 crc restorecon[4698]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 
04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc 
restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:27
crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 
04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:27 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 
22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 
crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc 
restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 04:07:28 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 04:07:29 crc kubenswrapper[4699]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.169015 4699 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176281 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176312 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176317 4699 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176322 4699 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176326 4699 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176331 4699 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176335 4699 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176340 4699 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176345 4699 feature_gate.go:330] unrecognized feature gate: Example
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176350 4699 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176354 4699 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176359 4699 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176363 4699 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176368 4699 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176384 4699 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176390 4699 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176395 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176400 4699 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176405 4699 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176410 4699 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176415 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176420 4699 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176425 4699 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176445 4699 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176451 4699 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176455 4699 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176458 4699 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176462 4699 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176465 4699 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176469 4699 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176472 4699 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176476 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176479 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176482 4699 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176486 4699 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176490 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176493 4699 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176497 4699 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176500 4699 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176507 4699 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176512 4699 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176516 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176520 4699 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176523 4699 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176526 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176531 4699 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176535 4699 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176539 4699 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176543 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176546 4699 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176550 4699 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176553 4699 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176557 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176561 4699 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176565 4699 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176568 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176572 4699 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176575 4699 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176581 4699 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176584 4699 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176589 4699 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176593 4699 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176597 4699 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176601 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176605 4699 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176609 4699 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176613 4699 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176620 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176624 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176628 4699 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.176632 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176734 4699 flags.go:64] FLAG: --address="0.0.0.0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176743 4699 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176753 4699 flags.go:64] FLAG: --anonymous-auth="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176759 4699 flags.go:64] FLAG: --application-metrics-count-limit="100"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176766 4699 flags.go:64] FLAG: --authentication-token-webhook="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176770 4699 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176776 4699 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176782 4699 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176787 4699 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176791 4699 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176796 4699 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176801 4699 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176806 4699 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176810 4699 flags.go:64] FLAG: --cgroup-root=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176814 4699 flags.go:64] FLAG: --cgroups-per-qos="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176819 4699 flags.go:64] FLAG: --client-ca-file=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176823 4699 flags.go:64] FLAG: --cloud-config=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176827 4699 flags.go:64] FLAG: --cloud-provider=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176831 4699 flags.go:64] FLAG: --cluster-dns="[]"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176837 4699 flags.go:64] FLAG: --cluster-domain=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176841 4699 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176846 4699 flags.go:64] FLAG: --config-dir=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176849 4699 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176854 4699 flags.go:64] FLAG: --container-log-max-files="5"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176860 4699 flags.go:64] FLAG: --container-log-max-size="10Mi"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176865 4699 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176869 4699 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176874 4699 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176878 4699 flags.go:64] FLAG: --contention-profiling="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176883 4699 flags.go:64] FLAG: --cpu-cfs-quota="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176888 4699 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176894 4699 flags.go:64] FLAG: --cpu-manager-policy="none"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176899 4699 flags.go:64] FLAG: --cpu-manager-policy-options=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176906 4699 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176911 4699 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176916 4699 flags.go:64] FLAG: --enable-debugging-handlers="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176920 4699 flags.go:64] FLAG: --enable-load-reader="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176924 4699 flags.go:64] FLAG: --enable-server="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176929 4699 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176936 4699 flags.go:64] FLAG: --event-burst="100"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176940 4699 flags.go:64] FLAG: --event-qps="50"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176944 4699 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176949 4699 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176955 4699 flags.go:64] FLAG: --eviction-hard=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176960 4699 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176964 4699 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176968 4699 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176973 4699 flags.go:64] FLAG: --eviction-soft=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176977 4699 flags.go:64] FLAG: --eviction-soft-grace-period=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176982 4699 flags.go:64] FLAG: --exit-on-lock-contention="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176986 4699 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176990 4699 flags.go:64] FLAG: --experimental-mounter-path=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176995 4699 flags.go:64] FLAG: --fail-cgroupv1="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.176999 4699 flags.go:64] FLAG: --fail-swap-on="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177003 4699 flags.go:64] FLAG: --feature-gates=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177009 4699 flags.go:64] FLAG: --file-check-frequency="20s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177013 4699 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177017 4699 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177022 4699 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177027 4699 flags.go:64] FLAG: --healthz-port="10248"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177032 4699 flags.go:64] FLAG: --help="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177036 4699 flags.go:64] FLAG: --hostname-override=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177040 4699 flags.go:64] FLAG: --housekeeping-interval="10s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177045 4699 flags.go:64] FLAG: --http-check-frequency="20s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177049 4699 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177053 4699 flags.go:64] FLAG: --image-credential-provider-config=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177057 4699 flags.go:64] FLAG: --image-gc-high-threshold="85"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177061 4699 flags.go:64] FLAG: --image-gc-low-threshold="80"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177066 4699 flags.go:64] FLAG: --image-service-endpoint=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177070 4699 flags.go:64] FLAG: --kernel-memcg-notification="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177074 4699 flags.go:64] FLAG: --kube-api-burst="100"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177078 4699 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177082 4699 flags.go:64] FLAG: --kube-api-qps="50"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177086 4699 flags.go:64] FLAG: --kube-reserved=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177090 4699 flags.go:64] FLAG: --kube-reserved-cgroup=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177094 4699 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177098 4699 flags.go:64] FLAG: --kubelet-cgroups=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177102 4699 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177106 4699 flags.go:64] FLAG: --lock-file=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177110 4699 flags.go:64] FLAG: --log-cadvisor-usage="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177117 4699 flags.go:64] FLAG: --log-flush-frequency="5s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177122 4699 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177130 4699 flags.go:64] FLAG: --log-json-split-stream="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177134 4699 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177139 4699 flags.go:64] FLAG: --log-text-split-stream="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177143 4699 flags.go:64] FLAG: --logging-format="text"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177147 4699 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177152 4699 flags.go:64] FLAG: --make-iptables-util-chains="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177156 4699 flags.go:64] FLAG: --manifest-url=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177160 4699 flags.go:64] FLAG: --manifest-url-header=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177167 4699 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177171 4699 flags.go:64] FLAG: --max-open-files="1000000"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177177 4699 flags.go:64] FLAG: --max-pods="110"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177182 4699 flags.go:64] FLAG: --maximum-dead-containers="-1"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177186 4699 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177190 4699 flags.go:64] FLAG: --memory-manager-policy="None"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177194 4699 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177199 4699 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177203 4699 flags.go:64] FLAG: --node-ip="192.168.126.11"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177207 4699 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177219 4699 flags.go:64] FLAG: --node-status-max-images="50"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177223 4699 flags.go:64] FLAG: --node-status-update-frequency="10s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177227 4699 flags.go:64] FLAG: --oom-score-adj="-999"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177232 4699 flags.go:64] FLAG: --pod-cidr=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177236 4699 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177245 4699 flags.go:64] FLAG: --pod-manifest-path=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177249 4699 flags.go:64] FLAG: --pod-max-pids="-1"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177253 4699 flags.go:64] FLAG: --pods-per-core="0"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177257 4699 flags.go:64] FLAG: --port="10250"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177261 4699 flags.go:64] FLAG: --protect-kernel-defaults="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177266 4699 flags.go:64] FLAG: --provider-id=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177270 4699 flags.go:64] FLAG: --qos-reserved=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177274 4699 flags.go:64] FLAG: --read-only-port="10255"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177278 4699 flags.go:64] FLAG: --register-node="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177282 4699 flags.go:64] FLAG: --register-schedulable="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177287 4699 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177295 4699 flags.go:64] FLAG: --registry-burst="10"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177299 4699 flags.go:64] FLAG: --registry-qps="5"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177303 4699 flags.go:64] FLAG: --reserved-cpus=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177307 4699 flags.go:64] FLAG: --reserved-memory=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177313 4699 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177317 4699 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177322 4699 flags.go:64] FLAG: --rotate-certificates="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177326 4699 flags.go:64] FLAG: --rotate-server-certificates="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177330 4699 flags.go:64] FLAG: --runonce="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177334 4699 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177339 4699 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177343 4699 flags.go:64] FLAG: --seccomp-default="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177347 4699 flags.go:64] FLAG: --serialize-image-pulls="true"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177352 4699 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177356 4699 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177360 4699 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177371 4699 flags.go:64] FLAG: --storage-driver-password="root"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177375 4699 flags.go:64] FLAG: --storage-driver-secure="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177380 4699 flags.go:64] FLAG: --storage-driver-table="stats"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177384 4699 flags.go:64] FLAG: --storage-driver-user="root"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177388 4699 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177392 4699 flags.go:64] FLAG: --sync-frequency="1m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177397 4699 flags.go:64] FLAG: --system-cgroups=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177401 4699 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177408 4699 flags.go:64] FLAG: --system-reserved-cgroup=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177413 4699 flags.go:64] FLAG: --tls-cert-file=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177417 4699 flags.go:64] FLAG: --tls-cipher-suites="[]"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177424 4699 flags.go:64] FLAG: --tls-min-version=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177428 4699 flags.go:64] FLAG: --tls-private-key-file=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177454 4699 flags.go:64] FLAG: --topology-manager-policy="none"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177463 4699 flags.go:64] FLAG: --topology-manager-policy-options=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177468 4699 flags.go:64] FLAG: --topology-manager-scope="container"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177473 4699 flags.go:64] FLAG: --v="2"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177482 4699 flags.go:64] FLAG: --version="false"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177490 4699 flags.go:64] FLAG: --vmodule=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177497 4699 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.177501 4699 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177639 4699 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177650 4699 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177656 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177661 4699 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177666 4699 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177671 4699 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177675 4699 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177679 4699 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177684 4699 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177688 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177692 4699 feature_gate.go:330] unrecognized feature gate: Example
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177699 4699 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177705 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177709 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177713 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177718 4699 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177722 4699 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177726 4699 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177730 4699 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177734 4699 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177739 4699 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177743 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177751 4699 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177756 4699 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177761 4699 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177766 4699 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177770 4699 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177774 4699 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177780 4699 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177785 4699 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177790 4699 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177794 4699 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177799 4699 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177803 4699 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177807 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177812 4699 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177816 4699 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177820 4699 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177824 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177828 4699 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177832 4699 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177836 4699 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177841 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177846 4699 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177850 4699 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177856 4699 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177861 4699 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177866 4699 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177870 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177875 4699 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177879 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177884 4699 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177888 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177892 4699 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177899 4699 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177904 4699 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177908 4699 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177912 4699 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177917 4699 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177922 4699 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177928 4699 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177933 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177938 4699 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177943 4699 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177948 4699 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177952 4699 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177957 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177962 4699 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177965 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177969 4699 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.177973 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.178884 4699 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.189710 4699 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.189739 4699 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189800 4699 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189811 4699 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189817 4699 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189823 4699 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189827 4699 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189831 4699 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189835 4699 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189838 4699 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189843 4699 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189846 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189850 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189853 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189857 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189860 4699 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189865 4699 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189870 4699 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189873 4699 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189877 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189881 4699 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189885 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189890 4699 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189893 4699 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189897 4699 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189901 4699 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189905 4699 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189909 4699 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189913 4699 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189916 4699 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189920 4699 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189923 4699 
feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189927 4699 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189930 4699 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189933 4699 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189937 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189941 4699 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189945 4699 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189950 4699 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189956 4699 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189960 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189965 4699 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189969 4699 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189976 4699 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189981 4699 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189985 4699 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189990 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189994 4699 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.189998 4699 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190002 4699 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190008 4699 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190013 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190018 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190023 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190027 4699 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190030 4699 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190034 4699 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190037 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190040 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190044 4699 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190047 4699 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190051 4699 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190055 4699 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190058 4699 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190061 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190065 4699 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190068 4699 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190072 4699 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190075 4699 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190079 4699 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190082 4699 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190085 4699 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190090 4699 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.190096 4699 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190215 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190223 4699 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190228 4699 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 04:07:29 crc kubenswrapper[4699]: 
W1122 04:07:29.190233 4699 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190237 4699 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190241 4699 feature_gate.go:330] unrecognized feature gate: Example Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190245 4699 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190248 4699 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190252 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190256 4699 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190260 4699 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190263 4699 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190267 4699 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190271 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190274 4699 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190277 4699 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190281 4699 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190285 4699 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190291 4699 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190295 4699 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190299 4699 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190303 4699 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190307 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190310 4699 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190314 4699 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190317 4699 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190322 4699 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190326 4699 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190330 4699 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190334 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190338 4699 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190342 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190346 4699 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190349 4699 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190354 4699 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190358 4699 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190362 4699 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190365 4699 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190369 4699 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190372 4699 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190376 4699 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190379 4699 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190382 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190386 4699 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190391 4699 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190395 4699 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190399 4699 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190403 4699 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190407 4699 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190411 4699 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190415 4699 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190419 4699 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190423 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190426 4699 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190456 4699 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190461 4699 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190465 4699 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190469 4699 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190473 4699 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190476 4699 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190480 4699 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190483 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190487 4699 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190490 4699 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190494 4699 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190498 4699 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190501 4699 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190505 4699 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190509 4699 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 
04:07:29.190513 4699 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.190518 4699 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.190526 4699 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.191602 4699 server.go:940] "Client rotation is on, will bootstrap in background" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.199363 4699 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.199512 4699 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.201297 4699 server.go:997] "Starting client certificate rotation" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.201329 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.201527 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 23:39:38.59187224 +0000 UTC Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.201646 4699 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1075h32m9.390229529s for next certificate rotation Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.262380 4699 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.266112 4699 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.284292 4699 log.go:25] "Validated CRI v1 runtime API" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.319665 4699 log.go:25] "Validated CRI v1 image API" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.321595 4699 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.328383 4699 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-04-02-45-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.328458 4699 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.347716 4699 manager.go:217] Machine: {Timestamp:2025-11-22 04:07:29.344777259 +0000 UTC m=+0.687398466 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:76c96961-7d99-459e-9731-5ae805318244 BootID:4852b328-c4f8-4280-9881-83927c94bf9a Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ac:4b:f6 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ac:4b:f6 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:38:13:44 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:88:7d:3b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:29:5e:0f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:74:9e:18 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:e8:f0:c0:ab:2a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e2:51:64:c5:56:7c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.347976 4699 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.348204 4699 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.348674 4699 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.348846 4699 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.348890 4699 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.351579 4699 topology_manager.go:138] "Creating topology manager with none policy"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.351603 4699 container_manager_linux.go:303] "Creating device plugin manager"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.352152 4699 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.352198 4699 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.354072 4699 state_mem.go:36] "Initialized new in-memory state store"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.354187 4699 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.358910 4699 kubelet.go:418] "Attempting to sync node with API server"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.358938 4699 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.358957 4699 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.358975 4699 kubelet.go:324] "Adding apiserver pod source"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.358989 4699 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.363988 4699 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.365281 4699 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.366808 4699 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.369543 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.369642 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.369826 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.369737 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.369946 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.369966 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.369978 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.369998 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370012 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370025 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370042 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370065 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370078 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370096 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.370110 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.370265 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.372924 4699 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.374162 4699 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.374347 4699 server.go:1280] "Started kubelet"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.375043 4699 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.375869 4699 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 22 04:07:29 crc systemd[1]: Started Kubernetes Kubelet.
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.383814 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384475 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384529 4699 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384563 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:16:32.707401171 +0000 UTC
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384612 4699 server.go:460] "Adding debug handlers to kubelet server"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384997 4699 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.385019 4699 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.385150 4699 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.385338 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.384614 4699 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 415h9m3.322790024s for next certificate rotation
Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.385954 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.386612 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError"
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.385694 4699 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a389c9d4c21ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 04:07:29.37403641 +0000 UTC m=+0.716657597,LastTimestamp:2025-11-22 04:07:29.37403641 +0000 UTC m=+0.716657597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.387966 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="200ms"
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.389491 4699 factory.go:55] Registering systemd factory
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.389918 4699 factory.go:221] Registration of the systemd container factory successfully
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.391499 4699 factory.go:153] Registering CRI-O factory
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.391520 4699 factory.go:221] Registration of the crio container factory successfully
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.391673 4699 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.391860 4699 factory.go:103] Registering Raw factory
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.391895 4699 manager.go:1196] Started watching for new ooms in manager
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.392797 4699 manager.go:319] Starting recovery of all containers
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394349 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394402 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394419 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394449 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394465 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394477 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394498 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394509 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394545 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394563 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394581 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394598 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394614 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394630 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394646 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394662 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394683 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394698 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394710 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394725 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394781 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394796 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394811 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394829 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394845 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394860 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394880 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394901 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394920 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394935 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394951 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394966 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.394991 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395013 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395031 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395047 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395064 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395081 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395097 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395114 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395130 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395146 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395163 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395182 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395200 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395217 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395233 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395251 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395272 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395288 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395307 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395323 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395345 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395364 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395386 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395403 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395418 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395454 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395470 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395486 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395504 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395520 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395536 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395590 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395610 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395626 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395645 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395663 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395681 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395698 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395717 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395734 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395752 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395771 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395788 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395803 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395820 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395838 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395854 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395869 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395887 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395904 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec"
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395920 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.395936 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398256 4699 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398321 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398342 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398357 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398381 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398411 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398454 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398578 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398595 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398609 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398626 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398640 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398667 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398680 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398709 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398722 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398734 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398749 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398770 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398792 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398827 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.398976 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" 
seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399007 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399028 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399045 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399116 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399131 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399159 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399190 
4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399203 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399215 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.399986 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400066 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400083 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400097 4699 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400115 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400142 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400154 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400172 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400191 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400233 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400262 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.400274 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.401710 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.401816 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.401868 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.401918 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.401986 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402099 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402159 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402192 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402249 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402298 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402331 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402373 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402484 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402570 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402679 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402707 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402796 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402826 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402899 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402941 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.402999 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403022 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403052 
4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403135 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403192 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403215 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403234 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403269 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403305 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403352 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403429 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403491 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403567 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403588 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403707 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403747 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403770 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403798 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403826 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403897 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403945 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403968 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.403996 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404018 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404039 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404074 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404121 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404148 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404220 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404263 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404291 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.404426 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.405192 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.405248 4699 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406352 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406492 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406765 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406904 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406927 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.406943 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407070 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407089 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407136 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407149 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407184 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407206 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407219 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407231 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407245 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407256 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407272 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.407285 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409548 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409658 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409728 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409752 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409772 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409790 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409805 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409826 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409848 4699 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409866 4699 reconstruct.go:97] "Volume reconstruction finished" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.409876 4699 reconciler.go:26] "Reconciler: start to sync state" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.420578 4699 manager.go:324] Recovery completed Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.432468 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.434997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.435033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.435042 
4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.436243 4699 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.436257 4699 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.436276 4699 state_mem.go:36] "Initialized new in-memory state store" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.444645 4699 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.446426 4699 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.446524 4699 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.446572 4699 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.446645 4699 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.447718 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.447867 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 
04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.457909 4699 policy_none.go:49] "None policy: Start" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.459156 4699 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.459256 4699 state_mem.go:35] "Initializing new in-memory state store" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.485532 4699 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.516037 4699 manager.go:334] "Starting Device Plugin manager" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.516113 4699 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.516132 4699 server.go:79] "Starting device plugin registration server" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.516827 4699 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.516847 4699 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.517074 4699 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.517371 4699 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.517471 4699 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.523679 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.546938 4699 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.547113 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548411 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548601 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548907 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.548980 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549395 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549650 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549884 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.549920 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550231 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550284 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550631 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550678 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550787 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.550840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc 
kubenswrapper[4699]: I1122 04:07:29.551047 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551169 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551591 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551784 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551880 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.551923 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552496 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552564 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.552963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.553196 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.553228 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554395 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554465 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.554668 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.589030 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="400ms" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613476 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613512 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613537 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613554 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613572 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613587 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613602 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613796 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613854 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613891 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613927 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.613960 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.614038 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.614096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.617788 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.619080 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.619120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.619132 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.619156 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.619673 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.715922 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.715988 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716021 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716051 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716082 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716112 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716139 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716174 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716201 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716247 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716208 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716283 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716309 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716317 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 
04:07:29.716172 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716327 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716285 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716345 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716285 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716461 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716498 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716469 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716581 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716630 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716702 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716713 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716800 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.716893 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.820263 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.821845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.821910 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.821934 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.821971 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.822472 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.883114 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.888299 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.913334 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.930494 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: I1122 04:07:29.936705 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.939401 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cf55632a0d47d58157666641ed0955f90ded0a08ee2037134141657cb379e8c7 WatchSource:0}: Error finding container cf55632a0d47d58157666641ed0955f90ded0a08ee2037134141657cb379e8c7: Status 404 returned error can't find the container with id cf55632a0d47d58157666641ed0955f90ded0a08ee2037134141657cb379e8c7 Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.942329 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e2928b0d65afd1490c68ee4146e41f14ddc4c79f8139e4a967ea3395d6003056 WatchSource:0}: Error finding container e2928b0d65afd1490c68ee4146e41f14ddc4c79f8139e4a967ea3395d6003056: Status 404 returned error can't find the container with id e2928b0d65afd1490c68ee4146e41f14ddc4c79f8139e4a967ea3395d6003056 Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.948769 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-8d6255ae1609dd498f34db136ab4458469eefec1cd11be2159812912e35d060e WatchSource:0}: Error finding container 8d6255ae1609dd498f34db136ab4458469eefec1cd11be2159812912e35d060e: Status 404 returned error can't find the container with id 8d6255ae1609dd498f34db136ab4458469eefec1cd11be2159812912e35d060e Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.954490 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e7101db4deb5826b5c78762752c00e09400fe8c8bc07100d322767114aa4dde6 
WatchSource:0}: Error finding container e7101db4deb5826b5c78762752c00e09400fe8c8bc07100d322767114aa4dde6: Status 404 returned error can't find the container with id e7101db4deb5826b5c78762752c00e09400fe8c8bc07100d322767114aa4dde6 Nov 22 04:07:29 crc kubenswrapper[4699]: W1122 04:07:29.959000 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-101e9dc9606d0742e30c2bb74cb3e17920523f09f4b3fe62bf3def7cf8d6b8c4 WatchSource:0}: Error finding container 101e9dc9606d0742e30c2bb74cb3e17920523f09f4b3fe62bf3def7cf8d6b8c4: Status 404 returned error can't find the container with id 101e9dc9606d0742e30c2bb74cb3e17920523f09f4b3fe62bf3def7cf8d6b8c4 Nov 22 04:07:29 crc kubenswrapper[4699]: E1122 04:07:29.990301 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="800ms" Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.164424 4699 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.136:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a389c9d4c21ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 04:07:29.37403641 +0000 UTC m=+0.716657597,LastTimestamp:2025-11-22 04:07:29.37403641 +0000 UTC m=+0.716657597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 22 04:07:30 crc kubenswrapper[4699]: 
I1122 04:07:30.222825 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.224585 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.224629 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.224645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.224670 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.224967 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 22 04:07:30 crc kubenswrapper[4699]: W1122 04:07:30.251683 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.251770 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:30 crc kubenswrapper[4699]: W1122 04:07:30.252940 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.253019 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.385056 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:30 crc kubenswrapper[4699]: W1122 04:07:30.441365 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.441464 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.450958 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"101e9dc9606d0742e30c2bb74cb3e17920523f09f4b3fe62bf3def7cf8d6b8c4"} Nov 22 04:07:30 crc kubenswrapper[4699]: 
I1122 04:07:30.451838 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e7101db4deb5826b5c78762752c00e09400fe8c8bc07100d322767114aa4dde6"} Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.452993 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8d6255ae1609dd498f34db136ab4458469eefec1cd11be2159812912e35d060e"} Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.456424 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2928b0d65afd1490c68ee4146e41f14ddc4c79f8139e4a967ea3395d6003056"} Nov 22 04:07:30 crc kubenswrapper[4699]: I1122 04:07:30.457363 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cf55632a0d47d58157666641ed0955f90ded0a08ee2037134141657cb379e8c7"} Nov 22 04:07:30 crc kubenswrapper[4699]: W1122 04:07:30.705682 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:30 crc kubenswrapper[4699]: E1122 04:07:30.705782 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:30 crc kubenswrapper[4699]: 
E1122 04:07:30.791962 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="1.6s" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.025057 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.026475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.026507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.026518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.026545 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:31 crc kubenswrapper[4699]: E1122 04:07:31.026959 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.385119 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.463685 4699 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0" exitCode=0 Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.463784 4699 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.463832 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.465349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.465391 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.465404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.467119 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696" exitCode=0 Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.467228 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.467412 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.468837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.468871 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 
22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.468884 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.470611 4699 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f" exitCode=0 Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.470669 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.470711 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.471647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.471671 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.471680 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.476276 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.476307 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.476319 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.476329 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.476416 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.478024 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.478079 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.478096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.480975 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689" exitCode=0 Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.481031 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689"} Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.481171 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.482715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.482751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.482761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.487684 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.489899 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.489936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:31 crc kubenswrapper[4699]: I1122 04:07:31.489945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.385091 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:32 crc kubenswrapper[4699]: E1122 04:07:32.392731 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.136:6443: connect: connection refused" interval="3.2s" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.484753 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.484858 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.485813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.485841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.485852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.487759 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8" exitCode=0 Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.487811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.487896 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.488689 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.488723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.488738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.491612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.491654 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.493733 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.493759 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd"} Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.493791 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.494545 4699 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.494589 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.494607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:32 crc kubenswrapper[4699]: W1122 04:07:32.568212 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:32 crc kubenswrapper[4699]: E1122 04:07:32.568289 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.627544 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.628904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.628937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.628946 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:32 crc kubenswrapper[4699]: I1122 04:07:32.628967 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:32 crc kubenswrapper[4699]: 
E1122 04:07:32.629399 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.136:6443: connect: connection refused" node="crc" Nov 22 04:07:33 crc kubenswrapper[4699]: W1122 04:07:33.016118 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:33 crc kubenswrapper[4699]: E1122 04:07:33.016590 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:33 crc kubenswrapper[4699]: W1122 04:07:33.158806 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:33 crc kubenswrapper[4699]: E1122 04:07:33.158885 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:33 crc kubenswrapper[4699]: W1122 04:07:33.184519 4699 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.136:6443: connect: connection refused Nov 22 04:07:33 crc kubenswrapper[4699]: E1122 04:07:33.184590 4699 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.136:6443: connect: connection refused" logger="UnhandledError" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.385179 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.136:6443: connect: connection refused Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.498226 4699 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4" exitCode=0 Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.498378 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.498379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4"} Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.499595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.499636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.499652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:33 crc 
kubenswrapper[4699]: I1122 04:07:33.501155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc"} Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.501268 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.502478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.502500 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.502510 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.504758 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505214 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505565 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e"} Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505599 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6"} Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 
04:07:33.505611 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba"} Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.505929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.506342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.506366 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.506374 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.831939 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.832152 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.833505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.833552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.833565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:33 crc kubenswrapper[4699]: I1122 04:07:33.841076 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.479937 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.517643 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee"} Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.517711 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.517711 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.517721 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a"} Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.518608 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.518650 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498"} Nov 22 
04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.517676 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.518793 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.518689 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec"} Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.519857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.519899 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.519924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.519956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.519930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.520064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.520421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:34 crc kubenswrapper[4699]: I1122 04:07:34.520514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:34 crc 
kubenswrapper[4699]: I1122 04:07:34.520539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.526833 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c"} Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.526963 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.526970 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528620 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528663 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.528689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.685205 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 22 04:07:35 crc 
kubenswrapper[4699]: I1122 04:07:35.830458 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.832381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.832462 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.832477 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:35 crc kubenswrapper[4699]: I1122 04:07:35.832506 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.280546 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.528964 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.528964 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.529895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.529927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.529939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.530507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.530537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:36 crc kubenswrapper[4699]: I1122 04:07:36.530546 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.258008 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.258182 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.258238 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.260094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.260133 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.260149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.532913 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.534286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.534333 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.534355 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.648214 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.648485 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.649883 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.649934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:37 crc kubenswrapper[4699]: I1122 04:07:37.649952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.012320 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.012566 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.012621 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.014404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.014489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:38 crc kubenswrapper[4699]: I1122 04:07:38.014530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 22 04:07:39 crc kubenswrapper[4699]: I1122 04:07:39.216097 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:39 crc kubenswrapper[4699]: I1122 04:07:39.216246 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:39 crc kubenswrapper[4699]: I1122 04:07:39.218009 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:39 crc kubenswrapper[4699]: I1122 04:07:39.218099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:39 crc kubenswrapper[4699]: I1122 04:07:39.218129 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:39 crc kubenswrapper[4699]: E1122 04:07:39.524002 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 04:07:41 crc kubenswrapper[4699]: I1122 04:07:41.012766 4699 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 22 04:07:41 crc kubenswrapper[4699]: I1122 04:07:41.012864 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:07:42 crc kubenswrapper[4699]: I1122 04:07:42.768830 4699 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 22 04:07:42 crc kubenswrapper[4699]: I1122 04:07:42.769033 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:42 crc kubenswrapper[4699]: I1122 04:07:42.770479 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:42 crc kubenswrapper[4699]: I1122 04:07:42.770539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:42 crc kubenswrapper[4699]: I1122 04:07:42.770557 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.386286 4699 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.555597 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.558336 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e" exitCode=255 Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.558384 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e"} Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.558588 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.559829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.559902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.559912 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.560668 4699 scope.go:117] "RemoveContainer" containerID="7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.612580 4699 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.612657 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 04:07:44 crc kubenswrapper[4699]: I1122 04:07:44.629489 4699 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 22 04:07:44 crc 
kubenswrapper[4699]: I1122 04:07:44.629548 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.563397 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.564975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f"} Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.565159 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.566081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.566135 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:45 crc kubenswrapper[4699]: I1122 04:07:45.566201 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.289409 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.567321 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.567401 4699 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.568516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.568578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.568591 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:46 crc kubenswrapper[4699]: I1122 04:07:46.572811 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:47 crc kubenswrapper[4699]: I1122 04:07:47.570517 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:47 crc kubenswrapper[4699]: I1122 04:07:47.571885 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:47 crc kubenswrapper[4699]: I1122 04:07:47.571942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:47 crc kubenswrapper[4699]: I1122 04:07:47.571962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:48 crc kubenswrapper[4699]: I1122 04:07:48.572744 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:48 crc kubenswrapper[4699]: I1122 04:07:48.574564 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:48 crc kubenswrapper[4699]: I1122 04:07:48.574625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:48 crc 
kubenswrapper[4699]: I1122 04:07:48.574639 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.220795 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.220976 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.222008 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.222048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.222061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 04:07:49 crc kubenswrapper[4699]: E1122 04:07:49.524149 4699 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 22 04:07:49 crc kubenswrapper[4699]: E1122 04:07:49.618033 4699 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.632895 4699 trace.go:236] Trace[1740371748]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:07:37.864) (total time: 11768ms):
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1740371748]: ---"Objects listed" error: 11768ms (04:07:49.632)
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1740371748]: [11.768720015s] [11.768720015s] END
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.632936 4699 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 22 04:07:49 crc kubenswrapper[4699]: E1122 04:07:49.633090 4699 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.635715 4699 trace.go:236] Trace[1587804008]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:07:39.199) (total time: 10435ms):
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1587804008]: ---"Objects listed" error: 10435ms (04:07:49.635)
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1587804008]: [10.435719742s] [10.435719742s] END
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.635746 4699 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.636021 4699 trace.go:236] Trace[1095869362]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:07:38.061) (total time: 11574ms):
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1095869362]: ---"Objects listed" error: 11574ms (04:07:49.635)
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[1095869362]: [11.574763586s] [11.574763586s] END
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.636072 4699 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.636181 4699 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.652792 4699 trace.go:236] Trace[255635398]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 04:07:37.955) (total time: 11697ms):
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[255635398]: ---"Objects listed" error: 11696ms (04:07:49.652)
Nov 22 04:07:49 crc kubenswrapper[4699]: Trace[255635398]: [11.69707858s] [11.69707858s] END
Nov 22 04:07:49 crc kubenswrapper[4699]: I1122 04:07:49.653372 4699 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.229363 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.241100 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.378060 4699 apiserver.go:52] "Watching apiserver"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.381262 4699 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.381505 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"]
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.381922 4699 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.381975 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.382022 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.382224 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.382249 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.382285 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.382307 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.382337 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.382550 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.384359 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.385688 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.385756 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.385857 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.386223 4699 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.387058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-network-node-identity"/"kube-root-ca.crt"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.387380 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.388022 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.388064 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.388166 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.418580 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.437687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441241 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441276 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441296 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441313 4699 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441332 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441356 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441372 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441389 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441447 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441466 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441483 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441502 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441519 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441535 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441551 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441567 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441676 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441707 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441723 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441741 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441724 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441770 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441787 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441735 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441804 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441880 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441934 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.441975 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442008 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442020 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442039 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442075 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442106 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442138 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442169 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442198 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442235 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442267 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442314 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442351 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442382 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442412 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442473 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442508 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442541 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442577 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442619 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442651 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442713 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442744 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442778 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442809 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442817 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442843 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442879 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442910 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442941 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.442971 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443002 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName:
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443034 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443066 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443098 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443167 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443198 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 
04:07:50.443230 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443262 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443294 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443327 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443359 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443391 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443423 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443487 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443520 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443555 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443588 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443625 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443656 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443688 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443719 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443752 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443785 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: 
I1122 04:07:50.443817 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444086 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444138 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444187 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444232 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444269 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444301 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444332 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444365 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444398 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444461 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444495 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444528 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444560 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444592 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444624 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444656 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444697 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.454765 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443515 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.463942 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.443561 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444124 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444192 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444668 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444699 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.444764 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:07:50.944733131 +0000 UTC m=+22.287354378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.464398 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.464511 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.445094 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.445506 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.445927 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.445503 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.451776 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.451962 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452144 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452168 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452294 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452418 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.465013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.465092 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.465325 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452522 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452525 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452552 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452545 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452577 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452778 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452812 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.452884 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453098 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453149 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453178 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453195 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453212 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453392 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453425 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.465687 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453706 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.453898 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.454254 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.454582 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.454895 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.454997 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455041 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455072 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455100 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455364 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455553 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455555 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455927 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.455975 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.456154 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.456370 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.456413 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.458063 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.458176 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.458545 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.458904 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.458962 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.459270 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.459361 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460016 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460041 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460167 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460195 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460298 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460241 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460474 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460533 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460586 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460657 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.460828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461330 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461354 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461391 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461461 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461524 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461936 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.461986 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.462717 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.463390 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.444999 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466470 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466514 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466543 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466568 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466602 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466632 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466661 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466681 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466700 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466722 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466739 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466758 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466781 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466803 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466799 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466886 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.466929 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467123 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467562 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467575 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467599 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467780 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467327 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467831 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.467952 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468006 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468201 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468275 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468343 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468386 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468456 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468502 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468785 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468872 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468991 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.468815 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.469727 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.470095 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.470202 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.469757 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471382 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471594 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471644 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.470354 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471718 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471886 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471934 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472057 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.470191 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.471082 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472332 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472398 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472592 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472751 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472768 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472809 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.472765 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473227 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473262 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473174 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473248 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473287 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473394 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473466 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473522 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473540 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: 
"6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473550 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473592 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473616 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473639 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473664 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 
04:07:50.473683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473767 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473823 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473841 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473854 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473927 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473976 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.473996 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474014 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474008 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474168 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474215 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474261 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474267 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474295 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474326 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474421 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474641 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474702 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474710 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474719 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.474749 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475095 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475159 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475180 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475241 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475262 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475284 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475295 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475302 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475383 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475494 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475562 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475598 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475624 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475650 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475676 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475825 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475878 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475919 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475958 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.475997 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476233 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476270 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476296 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " 
Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476315 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476333 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476352 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476370 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476390 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476408 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476612 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.476720 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477037 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477375 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477408 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477443 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477462 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477481 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477505 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477525 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477545 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477565 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477586 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477606 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477626 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477645 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477694 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477722 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477745 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477773 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477807 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477831 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477854 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477876 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") 
pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477931 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477955 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477978 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.477998 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478017 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478106 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478117 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478129 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478149 4699 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478159 4699 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478170 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478181 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478191 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478201 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478211 4699 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478221 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478231 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478240 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478252 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478264 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478275 4699 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478285 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478295 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478305 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478317 4699 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478327 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478337 4699 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478345 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478355 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478365 4699 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478374 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478384 4699 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478394 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478405 4699 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478414 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478425 4699 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478454 4699 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478464 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478475 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node 
\"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478486 4699 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478495 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478505 4699 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478514 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478524 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478534 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478543 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478552 4699 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478562 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478571 4699 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478581 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478592 4699 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478627 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478650 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478664 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on 
node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478679 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478691 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478703 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478714 4699 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478727 4699 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478742 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478755 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478769 4699 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478784 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478795 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478806 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478816 4699 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478827 4699 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478838 4699 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478847 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478857 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478868 4699 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478878 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478887 4699 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478901 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478911 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478922 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478933 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478943 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478953 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478963 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478972 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478982 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.478993 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479004 4699 
reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479014 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479023 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479035 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479045 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479054 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479064 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479074 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" 
(UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479085 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479095 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479105 4699 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479116 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479126 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479162 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479172 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479182 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479191 4699 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479200 4699 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479210 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479221 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479231 4699 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479241 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479252 4699 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479262 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479273 4699 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479285 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479296 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479308 4699 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479348 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479379 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479401 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479419 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479473 4699 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479489 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479503 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479517 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479531 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479548 4699 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479547 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479562 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479617 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479640 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479661 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479658 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479680 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479778 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479798 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479814 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479826 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479837 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479848 4699 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479859 4699 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479869 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479879 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479889 4699 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479901 4699 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479891 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479912 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479535 4699 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.480002 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.480067 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.480163 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.480279 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.482566 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.483813 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:50.983785237 +0000 UTC m=+22.326406444 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.480568 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.480830 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.481717 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.479926 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.480987 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.481069 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.481271 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.481543 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.483923 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:50.98388764 +0000 UTC m=+22.326508927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.481584 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.482103 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.482744 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.484423 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.484475 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.484537 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.484552 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.484796 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.485038 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.486067 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.486198 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.486781 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.487443 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.487484 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.487705 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.488922 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.488966 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.489277 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.492301 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.493855 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.494142 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.494423 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.494509 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.494579 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:50.994558348 +0000 UTC m=+22.337179535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.496009 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.496248 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.496706 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.496698 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.496724 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.496695 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.496780 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:50.996760402 +0000 UTC m=+22.339381579 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.496280 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.497043 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.497260 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.497355 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.498296 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.498444 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.501499 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.501953 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.502051 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505377 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505597 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505670 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505640 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505838 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.505901 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.506419 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.508324 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.508988 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.509576 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.509948 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510029 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510061 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510075 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510148 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510505 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510716 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510541 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.510942 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.511549 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.512529 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.516091 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.517092 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.527752 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.528405 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.539402 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.541687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.552858 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.569077 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580617 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580725 
4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580781 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580855 4699 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580875 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580897 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580909 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580923 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580934 4699 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580947 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580958 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580969 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580982 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.580994 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581005 4699 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 
04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581017 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581027 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581038 4699 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581051 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581062 4699 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581074 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581088 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc 
kubenswrapper[4699]: I1122 04:07:50.581097 4699 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581107 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581118 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581129 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581141 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581151 4699 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581160 4699 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581172 4699 reconciler_common.go:293] "Volume detached 
for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581182 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581193 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581207 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581222 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581236 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581251 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581264 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581275 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581285 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581296 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581308 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581319 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581334 4699 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581350 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node 
\"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581363 4699 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581378 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581389 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581399 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581410 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581421 4699 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581467 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581481 
4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581490 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581501 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581516 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581530 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581541 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581554 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581567 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581582 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581595 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581607 4699 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581618 4699 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581629 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.581212 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.582572 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.585348 4699 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.592752 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.622189 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.646216 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.666753 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.696994 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.703805 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.708496 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 04:07:50 crc kubenswrapper[4699]: W1122 04:07:50.718817 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-40a5827214f4570744ce717b8e3301561e7275e88606334de1e79d50f6245dd2 WatchSource:0}: Error finding container 40a5827214f4570744ce717b8e3301561e7275e88606334de1e79d50f6245dd2: Status 404 returned error can't find the container with id 40a5827214f4570744ce717b8e3301561e7275e88606334de1e79d50f6245dd2 Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.985126 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.985193 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.985223 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.985311 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.985355 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:51.985341838 +0000 UTC m=+23.327963025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.985402 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:07:51.985397169 +0000 UTC m=+23.328018356 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.985472 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: E1122 04:07:50.985493 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:51.985487632 +0000 UTC m=+23.328108819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.997388 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-86ztb"] Nov 22 04:07:50 crc kubenswrapper[4699]: I1122 04:07:50.997711 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.000857 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.000931 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.005511 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.024212 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.025033 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-h6ndp"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.025326 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: W1122 04:07:51.026680 4699 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.026713 4699 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:07:51 crc kubenswrapper[4699]: W1122 04:07:51.027205 4699 reflector.go:561] object-"openshift-image-registry"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.027285 4699 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:07:51 crc kubenswrapper[4699]: W1122 04:07:51.027328 4699 reflector.go:561] 
object-"openshift-image-registry"/"image-registry-certificates": failed to list *v1.ConfigMap: configmaps "image-registry-certificates" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.027344 4699 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"image-registry-certificates\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-registry-certificates\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:07:51 crc kubenswrapper[4699]: W1122 04:07:51.027468 4699 reflector.go:561] object-"openshift-image-registry"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.027489 4699 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.064729 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.078807 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.086220 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.086272 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086464 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086492 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086511 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086543 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086591 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:52.08656423 +0000 UTC m=+23.429185417 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086596 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086622 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.086710 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:52.086685123 +0000 UTC m=+23.429306310 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.088713 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.101577 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.118724 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.133180 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.148522 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.158362 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.168126 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.179994 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.186947 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd066499-5bd5-459c-8a02-d02f716c8965-host\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.186996 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca\") pod \"node-ca-h6ndp\" (UID: 
\"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.187015 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhkm\" (UniqueName: \"kubernetes.io/projected/dd066499-5bd5-459c-8a02-d02f716c8965-kube-api-access-9hhkm\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.187110 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34d15248-9724-41b0-8370-66127cc18bbe-hosts-file\") pod \"node-resolver-86ztb\" (UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.187131 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799vb\" (UniqueName: \"kubernetes.io/projected/34d15248-9724-41b0-8370-66127cc18bbe-kube-api-access-799vb\") pod \"node-resolver-86ztb\" (UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.210021 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.236015 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.251351 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.267754 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288168 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd066499-5bd5-459c-8a02-d02f716c8965-host\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288608 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288297 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dd066499-5bd5-459c-8a02-d02f716c8965-host\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288722 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhkm\" (UniqueName: \"kubernetes.io/projected/dd066499-5bd5-459c-8a02-d02f716c8965-kube-api-access-9hhkm\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288881 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799vb\" (UniqueName: \"kubernetes.io/projected/34d15248-9724-41b0-8370-66127cc18bbe-kube-api-access-799vb\") pod \"node-resolver-86ztb\" (UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288914 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34d15248-9724-41b0-8370-66127cc18bbe-hosts-file\") pod \"node-resolver-86ztb\" (UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.288989 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/34d15248-9724-41b0-8370-66127cc18bbe-hosts-file\") pod \"node-resolver-86ztb\" 
(UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.289710 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.304722 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.311711 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799vb\" (UniqueName: \"kubernetes.io/projected/34d15248-9724-41b0-8370-66127cc18bbe-kube-api-access-799vb\") pod \"node-resolver-86ztb\" (UID: \"34d15248-9724-41b0-8370-66127cc18bbe\") " pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 
04:07:51.318299 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-86ztb" Nov 22 04:07:51 crc kubenswrapper[4699]: W1122 04:07:51.335796 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d15248_9724_41b0_8370_66127cc18bbe.slice/crio-615bbc6a813b41e266c1d921497ce7a9def61b7cb2d8e86f58faac6006de66d2 WatchSource:0}: Error finding container 615bbc6a813b41e266c1d921497ce7a9def61b7cb2d8e86f58faac6006de66d2: Status 404 returned error can't find the container with id 615bbc6a813b41e266c1d921497ce7a9def61b7cb2d8e86f58faac6006de66d2 Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.453678 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.454371 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.457073 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.457872 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.459219 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.459972 4699 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.461393 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.462810 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.463773 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.465336 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.465978 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.468156 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.468862 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.469543 4699 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.471665 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.472410 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.473316 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.474340 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.474989 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.477303 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.480534 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.482061 4699 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.487492 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.488227 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.496493 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.497367 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.500589 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.501122 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.502005 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.503967 4699 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.504501 4699 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.504628 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.506749 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.507243 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.507718 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.511092 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.511852 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 04:07:51 
crc kubenswrapper[4699]: I1122 04:07:51.512344 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.513934 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.515022 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.515839 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.516784 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.517977 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.518991 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.519491 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 04:07:51 
crc kubenswrapper[4699]: I1122 04:07:51.520423 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.520968 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.522154 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.522685 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.523561 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.524054 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.525010 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.526028 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 04:07:51 
crc kubenswrapper[4699]: I1122 04:07:51.526514 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.589005 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.590339 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.594815 4699 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f" exitCode=255 Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.595006 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.595136 4699 scope.go:117] "RemoveContainer" containerID="7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.597174 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86ztb" event={"ID":"34d15248-9724-41b0-8370-66127cc18bbe","Type":"ContainerStarted","Data":"615bbc6a813b41e266c1d921497ce7a9def61b7cb2d8e86f58faac6006de66d2"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.598724 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"233bb6f47621b82f452c290c5ca4694792f6adbd22a443befebc139a047c189f"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.600814 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.600879 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"40a5827214f4570744ce717b8e3301561e7275e88606334de1e79d50f6245dd2"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.602656 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.603247 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0a495f59745e7c7adfbf7fc8225673a833bb300b3b0d43a9ffb488b53b74d63c"} Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.611575 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.625807 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.638575 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.648228 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.656764 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.666751 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.674027 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.674396 4699 scope.go:117] "RemoveContainer" containerID="a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.674786 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.676713 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.684833 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.697359 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.708369 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.721023 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.736289 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.747676 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.757485 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.772387 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:44Z\\\",\\\"message\\\":\\\"W1122 04:07:33.335076 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 04:07:33.335386 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763784453 cert, and key in /tmp/serving-cert-3900859243/serving-signer.crt, /tmp/serving-cert-3900859243/serving-signer.key\\\\nI1122 04:07:33.880107 1 observer_polling.go:159] Starting file observer\\\\nW1122 04:07:33.882546 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 04:07:33.882716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:33.883481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3900859243/tls.crt::/tmp/serving-cert-3900859243/tls.key\\\\\\\"\\\\nF1122 04:07:44.506415 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.789051 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.807411 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.820142 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.835655 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.839098 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kjwnt"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.839691 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.841450 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pmtb4"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.841751 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.842462 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-b7225"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.843004 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.843059 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.843212 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.843212 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.843648 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.844738 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z7552"] Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.845064 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.852210 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853012 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853028 4699 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853066 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853090 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853172 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853187 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.853090 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.854057 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.856784 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.857049 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.857841 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.858069 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.858572 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.858748 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.859124 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.867233 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.879417 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.889786 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.896873 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.904827 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.914197 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.926172 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:44Z\\\",\\\"message\\\":\\\"W1122 04:07:33.335076 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 04:07:33.335386 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763784453 cert, and key in /tmp/serving-cert-3900859243/serving-signer.crt, /tmp/serving-cert-3900859243/serving-signer.key\\\\nI1122 04:07:33.880107 1 observer_polling.go:159] Starting file observer\\\\nW1122 04:07:33.882546 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 04:07:33.882716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:33.883481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3900859243/tls.crt::/tmp/serving-cert-3900859243/tls.key\\\\\\\"\\\\nF1122 04:07:44.506415 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.935490 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.944765 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.954365 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.964908 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.976274 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.985045 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.993977 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994056 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cni-binary-copy\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.994188 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 04:07:53.994171729 +0000 UTC m=+25.336792916 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994245 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994265 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-etc-kubernetes\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccx9x\" (UniqueName: \"kubernetes.io/projected/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-kube-api-access-ccx9x\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994298 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994314 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994331 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/41bdbae2-706a-4f84-9f56-5a42aec77762-rootfs\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-system-cni-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994369 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-system-cni-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994388 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994417 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-hostroot\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994475 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbkvv\" (UniqueName: \"kubernetes.io/projected/7e5e536a-6797-4e6f-8160-1e23ddda1647-kube-api-access-vbkvv\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994521 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994547 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-socket-dir-parent\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994574 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994596 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41bdbae2-706a-4f84-9f56-5a42aec77762-proxy-tls\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994633 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 
04:07:51.994651 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994669 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-cnibin\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994693 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994735 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-netns\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994756 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-multus-certs\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: 
I1122 04:07:51.994783 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.994834 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994871 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2cd\" (UniqueName: \"kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994930 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtp5d\" (UniqueName: \"kubernetes.io/projected/41bdbae2-706a-4f84-9f56-5a42aec77762-kube-api-access-dtp5d\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.994949 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:53.994939827 +0000 UTC m=+25.337561014 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994969 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-daemon-config\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.994989 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995005 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41bdbae2-706a-4f84-9f56-5a42aec77762-mcd-auth-proxy-config\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995025 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-bin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: 
I1122 04:07:51.995044 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-kubelet\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995072 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-os-release\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995115 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc 
kubenswrapper[4699]: I1122 04:07:51.995215 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995242 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995274 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995294 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995314 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-cni-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc 
kubenswrapper[4699]: I1122 04:07:51.995346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cnibin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995391 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995417 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-os-release\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995483 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995527 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995551 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995573 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-conf-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995609 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-k8s-cni-cncf-io\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.995614 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:51 crc kubenswrapper[4699]: I1122 04:07:51.995631 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-multus\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:51 crc kubenswrapper[4699]: E1122 04:07:51.995663 4699 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:53.995653065 +0000 UTC m=+25.338274342 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.004735 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.014715 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.025297 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d
7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.063931 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096529 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-cni-dir\") pod 
\"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096550 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cnibin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096578 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096603 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-os-release\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096633 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096653 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-conf-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 
04:07:52.096679 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-k8s-cni-cncf-io\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096701 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-multus\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cni-binary-copy\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096748 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096755 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096775 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096796 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-cni-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096803 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096849 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-etc-kubernetes\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096877 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccx9x\" (UniqueName: \"kubernetes.io/projected/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-kube-api-access-ccx9x\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.096919 4699 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096895 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/41bdbae2-706a-4f84-9f56-5a42aec77762-rootfs\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.096951 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096960 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-system-cni-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096981 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096988 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-os-release\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097007 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097027 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-system-cni-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097048 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-system-cni-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097052 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.096741 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cnibin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097081 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-hostroot\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097111 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbkvv\" (UniqueName: \"kubernetes.io/projected/7e5e536a-6797-4e6f-8160-1e23ddda1647-kube-api-access-vbkvv\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097114 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097142 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097153 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-k8s-cni-cncf-io\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097160 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097183 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097226 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-socket-dir-parent\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097249 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097269 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097288 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41bdbae2-706a-4f84-9f56-5a42aec77762-proxy-tls\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097313 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-netns\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097336 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-multus-certs\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097367 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-cnibin\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097391 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097412 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-daemon-config\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097457 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2cd\" (UniqueName: \"kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097479 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtp5d\" (UniqueName: \"kubernetes.io/projected/41bdbae2-706a-4f84-9f56-5a42aec77762-kube-api-access-dtp5d\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097503 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-bin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097524 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-kubelet\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097537 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097579 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-cni-binary-copy\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097597 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-os-release\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097546 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-os-release\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097614 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash\") pod \"ovnkube-node-z7552\" 
(UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.096965 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097644 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-conf-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097651 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-multus-certs\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.097691 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:54.097675816 +0000 UTC m=+25.440297003 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.097068 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.097783 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.097794 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097800 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-hostroot\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097834 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc 
kubenswrapper[4699]: E1122 04:07:52.097838 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:54.09782479 +0000 UTC m=+25.440446217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097877 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097868 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097885 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-binary-copy\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097919 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-socket-dir-parent\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097958 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-cnibin\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097980 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-bin\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.097993 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098001 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-cni-multus\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098022 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098030 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-system-cni-dir\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098043 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098037 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-etc-kubernetes\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098072 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098083 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/41bdbae2-706a-4f84-9f56-5a42aec77762-rootfs\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098099 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098100 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e5e536a-6797-4e6f-8160-1e23ddda1647-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098109 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41bdbae2-706a-4f84-9f56-5a42aec77762-mcd-auth-proxy-config\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098132 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-var-lib-kubelet\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098141 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098151 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-host-run-netns\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098168 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098184 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098244 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098267 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 
04:07:52.098316 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098341 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098355 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098415 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098474 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098493 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098499 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098500 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098543 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-multus-daemon-config\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098544 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41bdbae2-706a-4f84-9f56-5a42aec77762-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.098856 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7e5e536a-6797-4e6f-8160-1e23ddda1647-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.112224 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.176306 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41bdbae2-706a-4f84-9f56-5a42aec77762-proxy-tls\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.176311 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbkvv\" (UniqueName: \"kubernetes.io/projected/7e5e536a-6797-4e6f-8160-1e23ddda1647-kube-api-access-vbkvv\") pod \"multus-additional-cni-plugins-b7225\" (UID: \"7e5e536a-6797-4e6f-8160-1e23ddda1647\") " pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.176546 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert\") pod \"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.177238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2cd\" (UniqueName: \"kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd\") pod 
\"ovnkube-node-z7552\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.178806 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtp5d\" (UniqueName: \"kubernetes.io/projected/41bdbae2-706a-4f84-9f56-5a42aec77762-kube-api-access-dtp5d\") pod \"machine-config-daemon-kjwnt\" (UID: \"41bdbae2-706a-4f84-9f56-5a42aec77762\") " pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.182657 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.200588 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccx9x\" (UniqueName: \"kubernetes.io/projected/c5f530d5-6f69-4838-a0dd-f4662ddbf85c-kube-api-access-ccx9x\") pod \"multus-pmtb4\" (UID: \"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\") " pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: W1122 04:07:52.213595 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3d3ec8_1b76_4cc3_bfc0_60a9c6bc29f3.slice/crio-1bce859ec6c521dfc40466f879cabfb7816b2238d18f4fbba72fbb2cd24fa9ec WatchSource:0}: Error finding container 1bce859ec6c521dfc40466f879cabfb7816b2238d18f4fbba72fbb2cd24fa9ec: Status 404 returned error can't find the container with id 1bce859ec6c521dfc40466f879cabfb7816b2238d18f4fbba72fbb2cd24fa9ec Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.237813 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.272599 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.281316 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.289367 4699 configmap.go:193] Couldn't get configMap openshift-image-registry/image-registry-certificates: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.289695 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca podName:dd066499-5bd5-459c-8a02-d02f716c8965 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:52.789676538 +0000 UTC m=+24.132297725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serviceca" (UniqueName: "kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca") pod "node-ca-h6ndp" (UID: "dd066499-5bd5-459c-8a02-d02f716c8965") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.319665 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.326332 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhkm\" (UniqueName: \"kubernetes.io/projected/dd066499-5bd5-459c-8a02-d02f716c8965-kube-api-access-9hhkm\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.347277 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.387448 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:44Z\\\",\\\"message\\\":\\\"W1122 04:07:33.335076 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 04:07:33.335386 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763784453 cert, and key in /tmp/serving-cert-3900859243/serving-signer.crt, /tmp/serving-cert-3900859243/serving-signer.key\\\\nI1122 04:07:33.880107 1 observer_polling.go:159] Starting file observer\\\\nW1122 04:07:33.882546 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 04:07:33.882716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:33.883481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3900859243/tls.crt::/tmp/serving-cert-3900859243/tls.key\\\\\\\"\\\\nF1122 04:07:44.506415 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.427213 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.440640 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.446913 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.446958 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.447023 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.447119 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.447211 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.447288 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.461036 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.468884 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pmtb4" Nov 22 04:07:52 crc kubenswrapper[4699]: W1122 04:07:52.470762 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bdbae2_706a_4f84_9f56_5a42aec77762.slice/crio-083ef6d18b06c8b17d1d6749e964d4ade2617656fd294bcb4c06951ab91654ef WatchSource:0}: Error finding container 083ef6d18b06c8b17d1d6749e964d4ade2617656fd294bcb4c06951ab91654ef: Status 404 returned error can't find the container with id 083ef6d18b06c8b17d1d6749e964d4ade2617656fd294bcb4c06951ab91654ef Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.475617 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b7225" Nov 22 04:07:52 crc kubenswrapper[4699]: W1122 04:07:52.479289 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f530d5_6f69_4838_a0dd_f4662ddbf85c.slice/crio-13564f49eb7681a79f38acf9bff8d61f70c0900e226fac82df536034e817641f WatchSource:0}: Error finding container 13564f49eb7681a79f38acf9bff8d61f70c0900e226fac82df536034e817641f: Status 404 returned error can't find the container with id 13564f49eb7681a79f38acf9bff8d61f70c0900e226fac82df536034e817641f Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.486909 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.609195 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.609241 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"083ef6d18b06c8b17d1d6749e964d4ade2617656fd294bcb4c06951ab91654ef"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.612754 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.629889 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerStarted","Data":"f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.629975 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerStarted","Data":"13564f49eb7681a79f38acf9bff8d61f70c0900e226fac82df536034e817641f"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.633507 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7515e23a57d4ee6d0c28dec98dd1a2ef25aebe1071a17b5fdf9496d2deb76b8e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:44Z\\\",\\\"message\\\":\\\"W1122 04:07:33.335076 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 04:07:33.335386 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763784453 cert, and key in /tmp/serving-cert-3900859243/serving-signer.crt, /tmp/serving-cert-3900859243/serving-signer.key\\\\nI1122 04:07:33.880107 1 observer_polling.go:159] Starting file observer\\\\nW1122 04:07:33.882546 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 04:07:33.882716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:33.883481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3900859243/tls.crt::/tmp/serving-cert-3900859243/tls.key\\\\\\\"\\\\nF1122 04:07:44.506415 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.639103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerStarted","Data":"5e356ecb1047e43dcb70f06b89f4931cfe5935a267bf909a735c055990a75eb0"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.640756 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" exitCode=0 Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.640828 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.640867 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"1bce859ec6c521dfc40466f879cabfb7816b2238d18f4fbba72fbb2cd24fa9ec"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.643404 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 04:07:52 crc 
kubenswrapper[4699]: I1122 04:07:52.647071 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.647136 4699 scope.go:117] "RemoveContainer" containerID="a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f" Nov 22 04:07:52 crc kubenswrapper[4699]: E1122 04:07:52.647289 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.648737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86ztb" event={"ID":"34d15248-9724-41b0-8370-66127cc18bbe","Type":"ContainerStarted","Data":"08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2"} Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.660078 4699 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.671271 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.689994 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.734317 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.770016 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.805387 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.808572 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dd066499-5bd5-459c-8a02-d02f716c8965-serviceca\") pod \"node-ca-h6ndp\" (UID: \"dd066499-5bd5-459c-8a02-d02f716c8965\") " pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.810637 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.814731 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.827649 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.851178 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.854007 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-h6ndp" Nov 22 04:07:52 crc kubenswrapper[4699]: W1122 04:07:52.866306 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd066499_5bd5_459c_8a02_d02f716c8965.slice/crio-2b113f25293a45b2e91676098f185e25dd13a341472a9f3e0097d76ba739f43d WatchSource:0}: Error finding container 2b113f25293a45b2e91676098f185e25dd13a341472a9f3e0097d76ba739f43d: Status 404 returned error can't find the container with id 2b113f25293a45b2e91676098f185e25dd13a341472a9f3e0097d76ba739f43d Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.872584 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.911063 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.949143 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:52 crc kubenswrapper[4699]: I1122 04:07:52.994080 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:52Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.028225 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.079674 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.113153 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.148972 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.189244 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.235295 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.269251 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.313359 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.355124 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.390253 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.429480 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.470322 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.508784 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.549286 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.589683 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.631231 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.659033 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.661214 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h6ndp" 
event={"ID":"dd066499-5bd5-459c-8a02-d02f716c8965","Type":"ContainerStarted","Data":"9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.661254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h6ndp" event={"ID":"dd066499-5bd5-459c-8a02-d02f716c8965","Type":"ContainerStarted","Data":"2b113f25293a45b2e91676098f185e25dd13a341472a9f3e0097d76ba739f43d"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.663184 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" containerID="8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6" exitCode=0 Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.663244 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669701 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669752 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" 
event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669791 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.669812 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.671916 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603"} Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.672498 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: E1122 04:07:53.703905 4699 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 22 
04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.761448 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.787132 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.813934 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.849353 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f41
6f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.888678 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.929043 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:53 crc kubenswrapper[4699]: I1122 04:07:53.968766 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:53Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.011205 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.017584 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.017752 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:07:58.017725852 +0000 UTC m=+29.360347049 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.017924 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.018059 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.018109 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:58.018098351 +0000 UTC m=+29.360719538 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.018225 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.018286 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:58.018274435 +0000 UTC m=+29.360895622 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.018068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.051198 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 
04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.090248 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.118694 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.118740 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.118874 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.118892 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.118904 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.118945 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:58.118931373 +0000 UTC m=+29.461552560 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.119251 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.119346 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.119428 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.119569 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:07:58.119554149 +0000 UTC m=+29.462175336 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.129388 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mu
ltus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"st
artTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.171150 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.216371 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.253131 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.293665 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.447225 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.447578 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.448080 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.447871 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.447601 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:54 crc kubenswrapper[4699]: E1122 04:07:54.448263 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.682240 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerStarted","Data":"7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64"} Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.698977 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.712951 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.727795 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.753064 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.765678 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.793629 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.809880 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.829258 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.844511 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 
04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.859840 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.875572 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.888687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.901675 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.915200 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:54 crc kubenswrapper[4699]: I1122 04:07:54.928602 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:54Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.687287 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" 
containerID="7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64" exitCode=0 Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.687346 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64"} Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.700719 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.704810 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.731886 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c3
9296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.746153 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.767605 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.781890 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.796058 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.807238 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.816948 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.830552 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.845569 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.856593 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.868791 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.882294 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.894546 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:55 crc kubenswrapper[4699]: I1122 04:07:55.907981 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:55Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.033723 4699 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.036629 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.036669 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.036678 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.036743 4699 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.044565 4699 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.044893 4699 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.046099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.046133 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.046147 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.046165 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.046177 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.063940 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.069803 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.069850 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.069867 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.069892 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.069909 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.083421 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.086667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.086694 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.086704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.086718 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.086726 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.097491 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"…\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.101035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.101065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.101073 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.101088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.101096 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.113358 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.117005 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.117027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.117036 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.117049 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.117058 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.128453 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.128589 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.130346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.130385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.130395 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.130412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.130424 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.232380 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.232412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.232420 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.232444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.232453 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.335586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.335636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.335654 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.335684 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.335708 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.439532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.439598 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.439620 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.439654 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.439678 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.447572 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.447747 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.447905 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.448030 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.448122 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:56 crc kubenswrapper[4699]: E1122 04:07:56.448204 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.542294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.542350 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.542370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.542396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.542414 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.644748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.644777 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.644785 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.644797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.644805 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.706254 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" containerID="df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6" exitCode=0 Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.706318 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.738985 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.753330 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.753371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.753382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.753399 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.753411 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.760841 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.786943 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.796701 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.813694 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.826719 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.842073 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.853643 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.855598 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.855628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.855640 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.855658 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.855669 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.866430 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.877676 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.894143 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 
04:07:56.907784 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.919536 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.930524 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.943037 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:56Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.958062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:56 crc 
kubenswrapper[4699]: I1122 04:07:56.958102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.958111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.958125 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:56 crc kubenswrapper[4699]: I1122 04:07:56.958135 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:56Z","lastTransitionTime":"2025-11-22T04:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.060415 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.060507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.060530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.060563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.060586 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.164098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.164179 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.164197 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.164227 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.164247 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.267322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.267607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.267738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.267836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.267940 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.370215 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.370254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.370264 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.370279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.370287 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.472894 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.472981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.473014 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.473052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.473076 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.576033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.576373 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.576612 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.576715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.576791 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.686832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.686914 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.686947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.686973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.686991 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.718184 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" containerID="c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7" exitCode=0 Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.718254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.735745 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.749105 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.764290 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.781191 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.789370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.789617 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.789843 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc 
kubenswrapper[4699]: I1122 04:07:57.790056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.790233 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.808248 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb4729346720
1be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.832182 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.851576 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.870386 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.886567 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.893526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.893596 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.893610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.893636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.893652 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.906009 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.930204 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.944332 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.970552 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.983877 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:57Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.997662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.997742 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.997767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.997800 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:57 crc kubenswrapper[4699]: I1122 04:07:57.997824 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:57Z","lastTransitionTime":"2025-11-22T04:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.034590 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.051960 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.052155 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.05212575 +0000 UTC m=+37.394746937 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.052226 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.052263 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.052385 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.052409 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.052489 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.052467728 +0000 UTC m=+37.395089035 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.052522 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.052509739 +0000 UTC m=+37.395131036 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.100157 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.100219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.100236 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.100263 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.100280 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.152856 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.152899 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153026 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153045 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153058 4699 projected.go:194] Error preparing data for projected volume 
kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153089 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153134 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153159 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153112 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.153099856 +0000 UTC m=+37.495721043 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.153281 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.15324887 +0000 UTC m=+37.495870097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.203577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.203647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.203671 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.203700 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.203721 4699 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.306791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.306853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.306872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.306896 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.306913 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.410665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.410732 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.410744 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.410770 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.410788 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.447743 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.447995 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.448254 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.448383 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.448477 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.448569 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.513258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.513315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.513327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.513351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.513364 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.610911 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.611869 4699 scope.go:117] "RemoveContainer" containerID="a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f" Nov 22 04:07:58 crc kubenswrapper[4699]: E1122 04:07:58.612094 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.616637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.616778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.616800 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.616827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.616886 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.720258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.720324 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.720343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.720368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.720387 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.823418 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.823517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.823536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.823563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.823580 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.927074 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.927139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.927158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.927196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:58 crc kubenswrapper[4699]: I1122 04:07:58.927214 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:58Z","lastTransitionTime":"2025-11-22T04:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.030847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.030901 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.030914 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.030932 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.030946 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.133023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.133060 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.133071 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.133089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.133100 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.236002 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.236061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.236078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.236104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.236121 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.339241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.339301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.339318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.339343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.339365 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.442068 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.442586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.442603 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.442624 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.442638 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.483224 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.500860 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.524689 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.544871 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.545937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.545974 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.545993 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.546018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.546035 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.562377 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.578018 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.592453 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.607537 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.624736 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.646355 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.648505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.648555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.648571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.648592 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.648609 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.663934 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.681272 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.702449 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.724644 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.733828 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerStarted","Data":"1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.741590 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.742598 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.742922 
4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.749902 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.753553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.753576 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.753585 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.753599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.753609 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.767394 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.782359 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.787294 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.788913 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.804567 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087
f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" 
for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.821491 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbk
vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.841000 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.856035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.856083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.856094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.856117 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.856132 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.857202 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.881998 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.900137 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.917196 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.929797 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.942036 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.958246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.958346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.958359 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.958377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.958388 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:07:59Z","lastTransitionTime":"2025-11-22T04:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.964776 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:07:59 crc kubenswrapper[4699]: I1122 04:07:59.985561 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:07:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.012868 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.027039 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.042367 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.058369 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.061385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.061459 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.061478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.061502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.061515 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.072808 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.088181 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.109097 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbk
vv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.127099 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.145760 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.164771 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.165011 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.165088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.165188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.165263 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.167132 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.184642 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.204149 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.220529 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.236987 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.259245 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.268383 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.268460 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.268522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.268543 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.268557 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.279061 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3b
d9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.313255 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.371841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.371906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.371920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.371946 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.371976 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.447859 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.447955 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.447886 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:00 crc kubenswrapper[4699]: E1122 04:08:00.448111 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:00 crc kubenswrapper[4699]: E1122 04:08:00.448278 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:00 crc kubenswrapper[4699]: E1122 04:08:00.448759 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.474851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.474892 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.474902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.474926 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.474939 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.578485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.578542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.578560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.578582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.578593 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.681921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.681999 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.682019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.682047 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.682065 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.749720 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" containerID="1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c" exitCode=0 Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.749814 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.749975 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.771267 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104e
bf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.785749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.785898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.785913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.785936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.785950 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.800562 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.818668 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.837404 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.854125 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.877648 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.889346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.889396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.889406 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.889447 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.889466 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.891076 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3b
d9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.910973 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.922094 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.934838 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.945630 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.959776 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.978775 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.992128 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.992166 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.992177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.992196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.992207 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:00Z","lastTransitionTime":"2025-11-22T04:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:00 crc kubenswrapper[4699]: I1122 04:08:00.997927 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:00Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.015295 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.095402 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.095502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.095517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.095551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.095567 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.199166 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.199240 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.199252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.199275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.199288 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.303182 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.303244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.303257 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.303280 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.303296 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.406578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.406828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.406844 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.406866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.406881 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.509552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.509605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.509621 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.509649 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.509665 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.612714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.612783 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.612801 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.612829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.612847 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.716329 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.716421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.716505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.716542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.716564 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.758928 4699 generic.go:334] "Generic (PLEG): container finished" podID="7e5e536a-6797-4e6f-8160-1e23ddda1647" containerID="854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03" exitCode=0 Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.759040 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerDied","Data":"854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.759085 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.777587 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104e
bf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.801929 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3b
e7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.816533 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.820270 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.820313 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.820325 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.820345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.820376 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.831332 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.857198 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.884072 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.901188 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.921409 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.923760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.923812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.923826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.923848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.923860 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:01Z","lastTransitionTime":"2025-11-22T04:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.932966 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.947062 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.958481 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.972014 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:01 crc kubenswrapper[4699]: I1122 04:08:01.988150 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:01Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.003254 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736
b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.017754 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.027018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.027064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.027075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 
04:08:02.027094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.027106 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.129442 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.129486 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.129495 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.129513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.129525 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.232709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.232774 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.232791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.232816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.232835 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.335728 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.335813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.335837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.335869 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.335897 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.438198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.438254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.438272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.438298 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.438317 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.447424 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.447549 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:02 crc kubenswrapper[4699]: E1122 04:08:02.447636 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.447777 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:02 crc kubenswrapper[4699]: E1122 04:08:02.447862 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:02 crc kubenswrapper[4699]: E1122 04:08:02.447963 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.540951 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.540997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.541012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.541033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.541046 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.643252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.643318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.643336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.643363 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.643388 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.747241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.747279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.747288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.747304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.747312 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.765744 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" event={"ID":"7e5e536a-6797-4e6f-8160-1e23ddda1647","Type":"ContainerStarted","Data":"07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.782675 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.797326 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.813178 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.827990 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.843122 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.850124 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.850164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.850175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.850193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.850203 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.864048 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.885892 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.897802 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.914971 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.931675 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d
6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da9
3b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T
04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.948612 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736
b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.952081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.952126 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.952137 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.952153 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.952164 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:02Z","lastTransitionTime":"2025-11-22T04:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.964465 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.978501 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:02 crc kubenswrapper[4699]: I1122 04:08:02.989893 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.005683 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.054933 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.054967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.054976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc 
kubenswrapper[4699]: I1122 04:08:03.054990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.055000 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.158048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.158116 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.158138 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.158169 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.158191 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.261376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.261427 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.261489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.261573 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.261595 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.365085 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.365130 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.365141 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.365158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.365168 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.468829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.468914 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.468936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.468967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.468991 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.572400 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.572485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.572506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.572532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.572549 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.675730 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.676115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.676274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.676423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.676586 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.774495 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/0.log" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.781643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.781714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.781735 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.781760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.781777 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.784677 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36" exitCode=1 Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.784734 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.786393 4699 scope.go:117] "RemoveContainer" containerID="c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.809547 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.834369 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.866198 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c10
1ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.882970 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.884983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.885059 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.885077 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.885111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.885129 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.905159 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:
32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.906678 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x"] Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.907605 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.909919 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.911017 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.932894 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445
c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.966379 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.983800 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.987300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.987338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.987349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.987368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:03 crc kubenswrapper[4699]: I1122 04:08:03.987379 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:03Z","lastTransitionTime":"2025-11-22T04:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.005521 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.020043 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.020083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.020305 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlj2\" (UniqueName: \"kubernetes.io/projected/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-kube-api-access-mxlj2\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.020471 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.027340 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.043333 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.056564 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.070310 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.085517 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.089755 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.089805 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.089814 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.089833 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.089843 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.102987 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.116242 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.121985 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlj2\" (UniqueName: \"kubernetes.io/projected/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-kube-api-access-mxlj2\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.122068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.122099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.122119 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.122782 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.123178 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.127713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.130170 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.138259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlj2\" (UniqueName: \"kubernetes.io/projected/686f15a0-53ce-4d3f-80e2-7d6272dc7d4d-kube-api-access-mxlj2\") pod \"ovnkube-control-plane-749d76644c-gqt5x\" (UID: \"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.143371 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.152756 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.163557 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.179077 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.192478 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.196792 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.196866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.196888 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.196916 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.196937 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.205238 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.217913 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.226416 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.232617 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: W1122 04:08:04.240749 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686f15a0_53ce_4d3f_80e2_7d6272dc7d4d.slice/crio-fb7010edda617f6c49152925557f48dc3ccca790a312d8eadc0665fde8142145 WatchSource:0}: Error finding container fb7010edda617f6c49152925557f48dc3ccca790a312d8eadc0665fde8142145: Status 404 returned error can't find the container with id fb7010edda617f6c49152925557f48dc3ccca790a312d8eadc0665fde8142145 Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.247587 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.263050 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.287808 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.298509 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.299682 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.299739 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.299758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.299813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc 
kubenswrapper[4699]: I1122 04:08:04.299831 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.326571 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c10
1ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.343130 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.402251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.402291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.402300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.402315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.402325 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.447210 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.447268 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.447224 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:04 crc kubenswrapper[4699]: E1122 04:08:04.447414 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:04 crc kubenswrapper[4699]: E1122 04:08:04.447527 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:04 crc kubenswrapper[4699]: E1122 04:08:04.447734 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.504259 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.504291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.504301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.504314 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.504323 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.607409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.607499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.607515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.607536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.607553 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.710297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.710354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.710370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.710395 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.710411 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.790326 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/0.log" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.793089 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.793292 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.793968 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" event={"ID":"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d","Type":"ContainerStarted","Data":"fb7010edda617f6c49152925557f48dc3ccca790a312d8eadc0665fde8142145"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.814548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.814588 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.814599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.814614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.814625 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.818168 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.841485 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.852017 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.863528 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.876412 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.891212 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.904124 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.916408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.916466 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.916481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.916513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.916530 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:04Z","lastTransitionTime":"2025-11-22T04:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.919126 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-
binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.933527 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.945329 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.961813 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:04 crc kubenswrapper[4699]: I1122 04:08:04.976288 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.001164 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.013597 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.018131 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.018171 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.018179 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.018193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.018202 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.026936 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3b
d9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.048328 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.120173 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.120216 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.120226 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.120243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.120253 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.223033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.223078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.223089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.223111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.223126 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.326541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.326597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.326610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.326630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.326643 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.384834 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pj52w"] Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.385346 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: E1122 04:08:05.385422 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.403258 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/ku
be-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.422989 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.429233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.429279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.429292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.429315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.429329 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.435916 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77tk6\" (UniqueName: \"kubernetes.io/projected/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-kube-api-access-77tk6\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.436004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.443941 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.459175 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.475110 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.485708 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.507896 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.524287 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.531563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.531618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.531633 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.531654 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.531668 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.537556 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.537636 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77tk6\" (UniqueName: \"kubernetes.io/projected/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-kube-api-access-77tk6\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: E1122 04:08:05.537773 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:05 crc kubenswrapper[4699]: E1122 04:08:05.537873 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:06.037847822 +0000 UTC m=+37.380469209 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.542592 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.558563 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77tk6\" (UniqueName: \"kubernetes.io/projected/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-kube-api-access-77tk6\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.567138 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.582159 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.594926 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc 
kubenswrapper[4699]: I1122 04:08:05.618612 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.634362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.634411 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.634461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.634489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.634503 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.645417 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.656796 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.680352 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.693022 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:05Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.737937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.738000 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.738019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.738045 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.738061 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.799525 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" event={"ID":"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d","Type":"ContainerStarted","Data":"5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.841872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.841952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.841962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.841997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.842013 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.949139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.949200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.949223 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.949245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:05 crc kubenswrapper[4699]: I1122 04:08:05.949260 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:05Z","lastTransitionTime":"2025-11-22T04:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.044646 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.044889 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.045035 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:07.044998039 +0000 UTC m=+38.387619296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.052657 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.052729 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.052748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.052776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.052794 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.145887 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.146031 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.146105 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.146221 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:08:22.14618012 +0000 UTC m=+53.488801347 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.146241 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.146316 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.146373 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:22.146337874 +0000 UTC m=+53.488959131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.146411 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-22 04:08:22.146393505 +0000 UTC m=+53.489014842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.155426 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.155536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.155560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.155595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.155616 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.247910 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.247982 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248145 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248189 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248207 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248277 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:22.248257323 +0000 UTC m=+53.590878520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248293 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248368 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248406 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.248511 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:22.248493889 +0000 UTC m=+53.591115176 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.257788 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.257836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.257853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.257876 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.257892 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.360500 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.360544 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.360554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.360572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.360582 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.383889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.383926 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.383934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.383949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.383958 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.397370 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.401582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.401619 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.401628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.401644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.401653 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.417521 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.422107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.422158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.422175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.422198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.422214 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.434708 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.438592 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.438627 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.438637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.438652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.438662 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.447235 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.447264 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.447291 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.447363 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.447471 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.447550 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.451036 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.454749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.454797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.454808 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.454828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.454840 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.470475 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:06Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:06 crc kubenswrapper[4699]: E1122 04:08:06.470695 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.473035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.473082 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.473098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.473126 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.473144 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.576837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.576879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.576895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.576915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.576930 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.680198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.680265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.680293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.680322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.680343 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.782355 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.782406 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.782417 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.782461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.782478 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.884706 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.884748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.884760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.884776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.884789 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.987855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.987943 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.987961 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.987985 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:06 crc kubenswrapper[4699]: I1122 04:08:06.988002 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:06Z","lastTransitionTime":"2025-11-22T04:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.058145 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:07 crc kubenswrapper[4699]: E1122 04:08:07.058403 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:07 crc kubenswrapper[4699]: E1122 04:08:07.058599 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:09.058564013 +0000 UTC m=+40.401185280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.090949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.091035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.091076 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.091107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.091166 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.194852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.194946 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.194975 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.195006 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.195031 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.298340 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.298416 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.298477 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.298510 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.298533 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.400857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.400890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.400898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.400912 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.400920 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.447662 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:07 crc kubenswrapper[4699]: E1122 04:08:07.447858 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.503020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.503066 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.503078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.503096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.503107 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.608673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.608717 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.608726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.608741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.608750 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.711462 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.711494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.711504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.711536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.711547 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.814265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.814310 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.814323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.814339 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.814350 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.917480 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.917524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.917536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.917559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:07 crc kubenswrapper[4699]: I1122 04:08:07.917572 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:07Z","lastTransitionTime":"2025-11-22T04:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.021304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.021368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.021385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.021412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.021467 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.123985 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.124021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.124032 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.124050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.124063 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.226672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.226707 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.226717 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.226731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.226741 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.329610 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.329662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.329675 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.329694 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.329706 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.431933 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.431978 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.431988 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.432002 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.432012 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.447767 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.447854 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.447884 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:08 crc kubenswrapper[4699]: E1122 04:08:08.448035 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:08 crc kubenswrapper[4699]: E1122 04:08:08.448121 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:08 crc kubenswrapper[4699]: E1122 04:08:08.448219 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.533793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.533850 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.533866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.533886 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.533900 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.637470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.637522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.637541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.637565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.637581 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.740156 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.740199 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.740207 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.740224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.740233 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.814567 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" event={"ID":"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d","Type":"ContainerStarted","Data":"17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.843195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.843483 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.843511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.843542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.843565 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.946409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.946508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.946525 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.946552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:08 crc kubenswrapper[4699]: I1122 04:08:08.946570 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:08Z","lastTransitionTime":"2025-11-22T04:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.049122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.049200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.049219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.049248 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.049266 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.077040 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:09 crc kubenswrapper[4699]: E1122 04:08:09.077234 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:09 crc kubenswrapper[4699]: E1122 04:08:09.077372 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:13.077342701 +0000 UTC m=+44.419963958 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.152769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.152817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.152829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.152852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.152864 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.255708 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.255753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.255765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.255782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.255795 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.358464 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.358497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.358506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.358519 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.358528 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.448009 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:09 crc kubenswrapper[4699]: E1122 04:08:09.448463 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.461709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.461743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.461752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.461768 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.461778 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.468381 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.483403 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.497115 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.512874 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.530060 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.545800 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.562346 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.565566 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.565650 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.565664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.565727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.565750 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.577840 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.593624 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.608378 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.622056 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.634986 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc 
kubenswrapper[4699]: I1122 04:08:09.649701 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.669290 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.669337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.669348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.669364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.669374 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.672975 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.685062 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.706494 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.720316 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.771807 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.771859 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.771870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.771890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.771903 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.829565 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/1.log" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.831078 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/0.log" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.835348 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688" exitCode=1 Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.835559 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.835854 4699 scope.go:117] "RemoveContainer" containerID="c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.836827 4699 scope.go:117] "RemoveContainer" containerID="4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688" Nov 22 04:08:09 crc kubenswrapper[4699]: E1122 04:08:09.837070 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.854738 4699 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050
ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.868986 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.873660 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.873705 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.873718 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.873738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.873750 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.880568 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.892541 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.906863 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.921891 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.938333 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.952125 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.969370 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.976689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:09 crc 
kubenswrapper[4699]: I1122 04:08:09.976734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.976747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.976765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.976776 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:09Z","lastTransitionTime":"2025-11-22T04:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.981178 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:09 crc kubenswrapper[4699]: I1122 04:08:09.992336 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.001098 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc 
kubenswrapper[4699]: I1122 04:08:10.013378 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.036419 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.049051 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.070612 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 
04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.080246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.080328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.080344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.080362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.080898 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.085259 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.099880 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.112366 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.122747 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.136932 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.152965 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.166578 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc 
kubenswrapper[4699]: I1122 04:08:10.179698 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.183647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.183727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.183747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.183780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.183802 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.212102 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.251761 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.278647 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), 
V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\
":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.286507 4699 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.286559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.286572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.286591 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.286605 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.291580 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.306693 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf
6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:0
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.319983 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.333321 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.344830 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.358428 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.373138 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:10Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.389451 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.389505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.389517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc 
kubenswrapper[4699]: I1122 04:08:10.389536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.389554 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.447133 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.447182 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.447133 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:10 crc kubenswrapper[4699]: E1122 04:08:10.447313 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:10 crc kubenswrapper[4699]: E1122 04:08:10.447409 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:10 crc kubenswrapper[4699]: E1122 04:08:10.447779 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.492812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.492863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.492876 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.492895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.492908 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.594982 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.595021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.595034 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.595055 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.595068 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.697410 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.697472 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.697483 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.697501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.697513 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.800211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.800300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.800326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.800356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.800380 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.844084 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/1.log" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.904828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.904907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.904929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.904959 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:10 crc kubenswrapper[4699]: I1122 04:08:10.904982 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:10Z","lastTransitionTime":"2025-11-22T04:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.009007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.009052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.009064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.009084 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.009099 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.112674 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.112753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.112772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.112804 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.112823 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.216119 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.216168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.216178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.216198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.216211 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.319224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.319348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.319368 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.319399 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.319419 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.422043 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.422086 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.422098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.422114 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.422126 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.447746 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:11 crc kubenswrapper[4699]: E1122 04:08:11.448000 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.524763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.524829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.524876 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.524905 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.524924 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.628183 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.628232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.628246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.628267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.628283 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.730475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.730515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.730524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.730539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.730549 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.834107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.834169 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.834186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.834214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.834232 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.936555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.936624 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.936643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.936668 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:11 crc kubenswrapper[4699]: I1122 04:08:11.936686 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:11Z","lastTransitionTime":"2025-11-22T04:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.039343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.039393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.039408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.039428 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.039465 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.142932 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.143012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.143035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.143062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.143084 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.246382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.246470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.246488 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.246514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.246531 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.349326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.349385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.349403 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.349499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.349521 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.447870 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.447974 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.447880 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:12 crc kubenswrapper[4699]: E1122 04:08:12.448070 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:12 crc kubenswrapper[4699]: E1122 04:08:12.448226 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:12 crc kubenswrapper[4699]: E1122 04:08:12.448307 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.452762 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.452823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.452837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.452855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.452868 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.556352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.556482 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.556502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.556525 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.556579 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.659614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.659689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.659711 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.659741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.659759 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.762924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.762968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.762981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.762999 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.763013 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.865642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.865698 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.865715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.865740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.865757 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.968928 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.968984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.969001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.969029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:12 crc kubenswrapper[4699]: I1122 04:08:12.969046 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:12Z","lastTransitionTime":"2025-11-22T04:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.072083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.072138 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.072149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.072168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.072181 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.124957 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:13 crc kubenswrapper[4699]: E1122 04:08:13.125260 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:13 crc kubenswrapper[4699]: E1122 04:08:13.125406 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:21.12537265 +0000 UTC m=+52.467994037 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.175549 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.175604 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.175615 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.175634 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.175653 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.278775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.278816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.278824 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.278844 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.278853 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.388564 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.388621 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.388632 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.388647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.388660 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.447682 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:13 crc kubenswrapper[4699]: E1122 04:08:13.448124 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.448469 4699 scope.go:117] "RemoveContainer" containerID="a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.494548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.494589 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.494601 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.494621 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.494634 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.598072 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.598112 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.598122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.598137 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.598146 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.701491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.701558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.701570 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.701589 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.701621 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.804704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.804754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.804770 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.804790 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.804802 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.864576 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.866458 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.867767 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.889452 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.905385 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.908342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.908636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.908653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.908715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.908730 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:13Z","lastTransitionTime":"2025-11-22T04:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.920639 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.933282 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.949071 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab78017
1e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.965816 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:13 crc kubenswrapper[4699]: I1122 04:08:13.981334 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.006690 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:13Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.011720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc 
kubenswrapper[4699]: I1122 04:08:14.011761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.011771 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.011790 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.011801 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.020798 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.034533 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc 
kubenswrapper[4699]: I1122 04:08:14.048878 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.070007 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.086498 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.110706 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2026c9f8cee707c298d93019fdaf6e74fcc7b074c088bcbb8e64c11c3c61c36\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"message\\\":\\\"ubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 04:08:03.118657 5950 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.118879 5950 reflector.go:311] Stopping reflector *v1.Pod (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119224 5950 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119368 5950 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.119723 5950 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1122 04:08:03.120184 5950 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 04:08:03.120237 5950 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 04:08:03.120285 5950 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 04:08:03.120331 5950 factory.go:656] Stopping watch factory\\\\nI1122 04:08:03.120368 5950 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:03.120426 5950 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:03.120494 5950 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), 
V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\
":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.115132 4699 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.115179 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.115190 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.115210 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.115255 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.130701 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.147792 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.182196 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:14Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.218952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.219044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.219098 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.219127 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.219193 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.323605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.323702 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.323720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.323774 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.323791 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.426931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.427012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.427037 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.427069 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.427092 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.447601 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.447758 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.447596 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:14 crc kubenswrapper[4699]: E1122 04:08:14.447963 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:14 crc kubenswrapper[4699]: E1122 04:08:14.447783 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:14 crc kubenswrapper[4699]: E1122 04:08:14.448196 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.535130 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.535202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.535260 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.535291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.535312 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.639056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.639132 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.639146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.639172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.639186 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.742957 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.743582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.743641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.743677 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.743700 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.846803 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.846857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.846870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.846889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.846903 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.949968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.950035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.950050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.950074 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:14 crc kubenswrapper[4699]: I1122 04:08:14.950089 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:14Z","lastTransitionTime":"2025-11-22T04:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.010549 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.011767 4699 scope.go:117] "RemoveContainer" containerID="4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688" Nov 22 04:08:15 crc kubenswrapper[4699]: E1122 04:08:15.012029 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.028420 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.045587 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.053667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.053731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.053746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.053795 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.053820 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.063465 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc 
kubenswrapper[4699]: I1122 04:08:15.085250 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.110118 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.126496 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.151098 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.157073 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.157130 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.157143 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.157160 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.157193 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.165817 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.184155 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.220323 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.241738 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.258879 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.260557 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.260721 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.260741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.260768 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.260785 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.271826 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.285523 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.308930 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\
"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.331479 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.347310 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:15Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.363976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc 
kubenswrapper[4699]: I1122 04:08:15.364027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.364044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.364068 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.364085 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.448146 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:15 crc kubenswrapper[4699]: E1122 04:08:15.448553 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.466791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.466845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.466863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.466905 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.466923 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.570328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.570387 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.570398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.570424 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.570464 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.673758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.673815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.673834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.673860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.673880 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.777587 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.777640 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.777657 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.777684 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.777704 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.881792 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.882075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.882144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.882207 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.882266 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.985364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.985428 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.985465 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.985491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:15 crc kubenswrapper[4699]: I1122 04:08:15.985505 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:15Z","lastTransitionTime":"2025-11-22T04:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.088502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.088573 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.088590 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.088616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.088633 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.190760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.191876 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.192083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.192272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.192481 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.295800 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.295887 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.295913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.295947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.295973 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.398986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.399033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.399058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.399081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.399095 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.447782 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.447947 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.448185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.448237 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.448733 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.448985 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.501968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.502030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.502048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.502074 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.502095 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.546419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.546494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.546504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.546520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.546532 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.562051 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.566902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.566964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.566982 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.567007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.567025 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.582779 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.587751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.587865 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.587884 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.587907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.587924 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.604321 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.609638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.609673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.609684 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.609704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.609721 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.622294 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.626405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.626504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.626523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.626549 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.626567 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.643268 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:16Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:16 crc kubenswrapper[4699]: E1122 04:08:16.643415 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.645194 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.645232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.645244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.645262 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.645273 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.748427 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.748555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.748579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.748608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.748628 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.850746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.850782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.850794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.850808 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.850817 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.953693 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.953730 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.953743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.953760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:16 crc kubenswrapper[4699]: I1122 04:08:16.953772 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:16Z","lastTransitionTime":"2025-11-22T04:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.057470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.057562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.057578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.057600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.057620 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.161128 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.161190 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.161201 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.161221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.161233 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.264456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.264530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.264547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.264601 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.264617 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.368139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.368204 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.368224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.368253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.368268 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.447876 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:17 crc kubenswrapper[4699]: E1122 04:08:17.448101 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.475006 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.475199 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.475305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.475391 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.475462 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.577899 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.577934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.577942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.577975 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.577987 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.653315 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.663880 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.672798 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.680779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.680810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.680823 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.680841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.680851 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.700152 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df
5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.717266 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.743386 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.756611 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.771562 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.783115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.783163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.783173 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc 
kubenswrapper[4699]: I1122 04:08:17.783202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.783214 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.789546 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85
4f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.807687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.828835 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.855309 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.880797 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.891211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.891247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.891257 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.891274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.891284 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.901836 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.917966 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.932048 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.945410 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.959331 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.972567 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:17Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:17 crc 
kubenswrapper[4699]: I1122 04:08:17.994378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.994419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.994466 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.994490 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:17 crc kubenswrapper[4699]: I1122 04:08:17.994504 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:17Z","lastTransitionTime":"2025-11-22T04:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.098312 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.098402 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.098426 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.098510 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.098533 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.201813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.202251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.202327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.202396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.202485 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.305920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.306004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.306021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.306046 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.306065 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.409391 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.409834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.409900 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.409983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.410057 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.446877 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.446978 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:18 crc kubenswrapper[4699]: E1122 04:08:18.447047 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:18 crc kubenswrapper[4699]: E1122 04:08:18.447305 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.447773 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:18 crc kubenswrapper[4699]: E1122 04:08:18.447918 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.549972 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.550025 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.550033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.550051 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.550062 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.654628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.654696 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.654721 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.654755 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.654785 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.758105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.758155 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.758172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.758232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.758249 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.861152 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.861201 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.861211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.861227 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.861239 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.963216 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.963281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.963297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.963320 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:18 crc kubenswrapper[4699]: I1122 04:08:18.963337 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:18Z","lastTransitionTime":"2025-11-22T04:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.067813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.067950 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.067971 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.068003 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.068067 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.171726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.171782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.171794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.171816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.171828 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.275524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.275568 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.275584 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.275609 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.275628 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.378767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.378821 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.378831 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.378853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.378869 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.447106 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:19 crc kubenswrapper[4699]: E1122 04:08:19.447279 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.470071 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.481759 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.481799 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.481809 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.481826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.481836 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.488686 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.503535 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc 
kubenswrapper[4699]: I1122 04:08:19.525484 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.551927 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.564371 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.587452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.587497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.587510 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.587528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc 
kubenswrapper[4699]: I1122 04:08:19.587538 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.593544 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.611057 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.652065 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.676825 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.689683 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.689715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.689724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.689740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.689749 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.692949 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.707576 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.721059 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.733499 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.748214 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.765404 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.781746 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.793001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.793065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.793079 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.793105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.793122 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.798189 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:19Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.895782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.895851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.895869 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.895900 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.895918 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.999254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.999311 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.999323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.999343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:19 crc kubenswrapper[4699]: I1122 04:08:19.999354 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:19Z","lastTransitionTime":"2025-11-22T04:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.102416 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.102514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.102533 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.102560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.102584 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.206195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.206260 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.206271 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.206289 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.206302 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.310647 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.310690 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.310702 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.310722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.310745 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.413342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.413383 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.413393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.413409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.413418 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.447173 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.447267 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.447198 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:20 crc kubenswrapper[4699]: E1122 04:08:20.447392 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:20 crc kubenswrapper[4699]: E1122 04:08:20.447562 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:20 crc kubenswrapper[4699]: E1122 04:08:20.447675 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.518579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.518648 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.518662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.518688 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.518703 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.622542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.622600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.622615 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.622638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.622658 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.724950 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.724987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.724998 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.725015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.725027 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.827450 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.827489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.827500 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.827515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.827526 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.929929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.929977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.929986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.930004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:20 crc kubenswrapper[4699]: I1122 04:08:20.930016 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:20Z","lastTransitionTime":"2025-11-22T04:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.033067 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.033134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.033145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.033164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.033175 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.136346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.136410 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.136424 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.136491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.136506 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.221974 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:21 crc kubenswrapper[4699]: E1122 04:08:21.222215 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:21 crc kubenswrapper[4699]: E1122 04:08:21.222301 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:08:37.222277139 +0000 UTC m=+68.564898316 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.239706 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.239763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.239781 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.239804 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.239823 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.342921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.342977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.342986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.343005 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.343018 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.445728 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.445789 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.445803 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.445823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.445837 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.447385 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:21 crc kubenswrapper[4699]: E1122 04:08:21.447520 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.549404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.549499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.549509 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.549527 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.549540 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.652316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.652390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.652421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.652475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.652494 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.755234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.755279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.755289 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.755303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.755314 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.858648 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.858692 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.858705 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.858722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.858735 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.962009 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.962068 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.962081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.962100 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:21 crc kubenswrapper[4699]: I1122 04:08:21.962113 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:21Z","lastTransitionTime":"2025-11-22T04:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.065193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.065288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.065300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.065315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.065325 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.171068 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.171117 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.171134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.171153 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.171167 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.233624 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.233804 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.233897 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:08:54.233855385 +0000 UTC m=+85.576476572 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.233904 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.233954 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.233993 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:54.233985858 +0000 UTC m=+85.576607035 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.234070 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.234142 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:54.234118472 +0000 UTC m=+85.576739779 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.273848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.273900 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.273910 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.273930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 
22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.273941 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.334880 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.334927 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335083 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335102 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335116 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335179 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:54.335160289 +0000 UTC m=+85.677781476 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335317 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335393 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335422 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.335567 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:08:54.335535879 +0000 UTC m=+85.678157096 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.377234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.377294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.377304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.377317 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.377326 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.447026 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.447117 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.447218 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.447036 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.447384 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:22 crc kubenswrapper[4699]: E1122 04:08:22.447514 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.481754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.481815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.481827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.481848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.481860 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.584775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.584856 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.584869 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.584889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.584904 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.688163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.688241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.688260 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.688287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.688306 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.791668 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.791724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.791734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.791829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.791845 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.894934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.895007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.895030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.895063 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.895089 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.997349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.997456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.997481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.997513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:22 crc kubenswrapper[4699]: I1122 04:08:22.997536 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:22Z","lastTransitionTime":"2025-11-22T04:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.100478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.100554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.100572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.100597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.100615 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.204965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.205027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.205044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.205070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.205088 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.307734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.307782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.307793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.307811 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.307821 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.411847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.411909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.411928 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.411956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.411981 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.447766 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:23 crc kubenswrapper[4699]: E1122 04:08:23.448010 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.514779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.514843 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.514860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.514890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.514911 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.617666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.617719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.617736 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.617763 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.617780 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.721064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.721174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.721194 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.721220 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.721237 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.824343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.824421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.824473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.824504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.824523 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.927212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.927294 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.927308 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.927327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:23 crc kubenswrapper[4699]: I1122 04:08:23.927358 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:23Z","lastTransitionTime":"2025-11-22T04:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.030414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.030505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.030523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.030553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.030573 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.134693 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.134756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.134784 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.134811 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.134837 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.238096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.238181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.238205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.238249 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.238276 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.341991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.342048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.342254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.342275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.342287 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.444972 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.445040 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.445064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.445089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.445108 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.447254 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.447281 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.447281 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:24 crc kubenswrapper[4699]: E1122 04:08:24.447419 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:24 crc kubenswrapper[4699]: E1122 04:08:24.447535 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:24 crc kubenswrapper[4699]: E1122 04:08:24.447626 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.548551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.548642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.548670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.548710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.548751 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.656897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.656953 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.656968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.656991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.657006 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.760183 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.760256 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.760277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.760305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.760324 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.863711 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.863769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.863786 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.863813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.863830 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.966982 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.967398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.967698 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.967915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:24 crc kubenswrapper[4699]: I1122 04:08:24.968117 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:24Z","lastTransitionTime":"2025-11-22T04:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.071016 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.071066 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.071076 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.071099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.071114 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.174390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.174517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.174528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.174545 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.174556 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.278634 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.278699 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.278718 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.278738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.278752 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.382914 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.383004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.383028 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.383064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.383082 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.447528 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:25 crc kubenswrapper[4699]: E1122 04:08:25.447743 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.486004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.486044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.486056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.486075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.486090 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.589493 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.589563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.589577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.589601 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.589617 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.693251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.693295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.693313 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.693338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.693354 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.798504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.798574 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.798597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.798623 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.798642 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.902394 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.902473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.902485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.902507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:25 crc kubenswrapper[4699]: I1122 04:08:25.902519 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:25Z","lastTransitionTime":"2025-11-22T04:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.005459 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.005511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.005521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.005541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.005557 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.108714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.108778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.108794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.108819 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.108836 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.212389 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.212487 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.212501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.212521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.212533 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.315861 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.315904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.315912 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.315930 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.315942 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.419947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.420456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.420643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.420796 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.420940 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.447697 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.447808 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.447876 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.448016 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.448366 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.448653 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.524542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.524616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.524644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.524683 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.524708 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.627637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.627720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.627751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.627791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.627815 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.731375 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.731453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.731468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.731487 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.731504 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.834215 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.834277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.834329 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.834360 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.834382 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.918110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.918167 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.918177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.918198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.918212 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.931595 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.936578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.936614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.936630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.936652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.936665 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.950189 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.959947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.959985 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.960013 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.960032 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.960044 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.972721 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.977093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.977162 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.977176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.977209 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.977223 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:26 crc kubenswrapper[4699]: E1122 04:08:26.991041 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:26Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.994724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.994761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.994770 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.994791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:26 crc kubenswrapper[4699]: I1122 04:08:26.994805 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:26Z","lastTransitionTime":"2025-11-22T04:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: E1122 04:08:27.006246 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:27Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:27 crc kubenswrapper[4699]: E1122 04:08:27.006383 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.008053 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.008173 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.008192 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.008221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.008269 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.111144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.111224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.111239 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.111257 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.111272 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.214980 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.215038 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.215048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.215070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.215123 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.317726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.317810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.317861 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.317902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.318002 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.421580 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.421637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.421650 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.421667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.421679 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.447597 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:27 crc kubenswrapper[4699]: E1122 04:08:27.447832 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.525297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.525365 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.525386 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.525412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.525461 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.628571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.628648 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.628676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.628710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.628733 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.732210 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.732267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.732287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.732315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.732333 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.835847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.835903 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.835921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.835943 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.835960 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.937965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.938042 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.938059 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.938088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:27 crc kubenswrapper[4699]: I1122 04:08:27.938109 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:27Z","lastTransitionTime":"2025-11-22T04:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.041192 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.041369 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.041392 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.041422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.041464 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.144860 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.144927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.144945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.144973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.144991 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.248878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.248959 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.248983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.249017 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.249039 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.352234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.352286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.352297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.352316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.352328 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.446933 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.447173 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.447266 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:28 crc kubenswrapper[4699]: E1122 04:08:28.447265 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:28 crc kubenswrapper[4699]: E1122 04:08:28.447548 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:28 crc kubenswrapper[4699]: E1122 04:08:28.447630 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.454958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.455036 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.455080 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.455113 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.455133 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.558165 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.558243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.558262 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.558288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.558311 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.661050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.661119 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.661144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.661175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.661192 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.764220 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.764267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.764275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.764292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.764301 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.867819 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.867897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.867921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.867969 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.868000 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.971211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.971274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.971292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.971321 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:28 crc kubenswrapper[4699]: I1122 04:08:28.971338 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:28Z","lastTransitionTime":"2025-11-22T04:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.074921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.074978 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.074992 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.075015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.075030 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.177937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.178001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.178018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.178077 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.178100 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.281691 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.281734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.281746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.281764 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.281775 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.384889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.384927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.384939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.384958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.384971 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.446966 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:29 crc kubenswrapper[4699]: E1122 04:08:29.447105 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.448425 4699 scope.go:117] "RemoveContainer" containerID="4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.466599 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.491665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.491900 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.491909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.491927 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.491936 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.495255 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.507678 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.533982 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.547824 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.561842 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.577448 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.592698 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.594579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.594605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.594617 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.594636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.594648 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.609579 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.624683 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.636180 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.648816 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.659674 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.673325 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.689162 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.709034 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc 
kubenswrapper[4699]: I1122 04:08:29.709452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.709608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.709743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.709863 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.724624 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.754893 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.769232 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc 
kubenswrapper[4699]: I1122 04:08:29.812779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.812815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.812823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.812838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.812914 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.914931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.914967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.914977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.914992 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.915003 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:29Z","lastTransitionTime":"2025-11-22T04:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.946355 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/1.log" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.951663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455"} Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.952182 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.970400 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:29 crc kubenswrapper[4699]: I1122 04:08:29.988389 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.001795 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:29Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.012170 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.017369 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.017421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.017454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.017476 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.017492 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.027273 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.043123 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.055860 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a
44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.071032 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.086357 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.106670 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.120247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc 
kubenswrapper[4699]: I1122 04:08:30.120299 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.120316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.120337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.120353 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.126787 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.142498 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.155646 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc 
kubenswrapper[4699]: I1122 04:08:30.171528 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.194720 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.206522 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.222987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.223027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.223035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.223052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc 
kubenswrapper[4699]: I1122 04:08:30.223061 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.228096 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.240082 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.325451 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.325505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.325516 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.325537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.325550 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.428542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.428607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.428626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.428656 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.428676 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.447091 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.447184 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:30 crc kubenswrapper[4699]: E1122 04:08:30.447218 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.447373 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:30 crc kubenswrapper[4699]: E1122 04:08:30.447363 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:30 crc kubenswrapper[4699]: E1122 04:08:30.447421 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.531823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.531907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.531924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.531946 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.531962 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.634685 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.634742 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.634761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.634784 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.634801 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.738954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.739034 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.739062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.739095 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.739120 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.843604 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.843669 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.843693 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.843725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.843747 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.946813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.946874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.946891 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.946919 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.946947 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:30Z","lastTransitionTime":"2025-11-22T04:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.958086 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/2.log" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.959278 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/1.log" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.963283 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" exitCode=1 Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.963321 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455"} Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.963361 4699 scope.go:117] "RemoveContainer" containerID="4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.964925 4699 scope.go:117] "RemoveContainer" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" Nov 22 04:08:30 crc kubenswrapper[4699]: E1122 04:08:30.965251 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:08:30 crc kubenswrapper[4699]: I1122 04:08:30.988541 4699 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:30Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.008120 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.020093 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.032657 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.046095 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.051520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.051546 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.051556 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc 
kubenswrapper[4699]: I1122 04:08:31.051573 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.051583 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.070834 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85
4f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.088407 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.105251 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.121489 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.140978 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.154002 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc 
kubenswrapper[4699]: I1122 04:08:31.154244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.154309 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.154380 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.154450 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.161282 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.187648 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.207422 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc 
kubenswrapper[4699]: I1122 04:08:31.235332 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.258059 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.258115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.258133 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.258159 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.258177 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.262499 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.281106 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.313720 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 
6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.332090 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.360218 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.360269 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 
04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.360282 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.360303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.360323 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.447577 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:31 crc kubenswrapper[4699]: E1122 04:08:31.447819 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.462850 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.462909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.462922 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.462942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.462957 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.565908 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.565947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.565957 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.565976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.565989 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.668188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.668250 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.668266 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.668287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.668301 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.758533 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.771040 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.771110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.771132 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.771164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.771187 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.779513 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3b
d9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.812421 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.824787 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.842825 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bc11f794671091f44b26888a6b2e95b17d76dec770be187a8ce9cea8c7c9688\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"message\\\":\\\"cs/network-check-target for network=default are: map[]\\\\nI1122 04:08:05.185222 6150 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 04:08:05.185233 6150 services_controller.go:443] Built service openshift-network-diagnostics/network-check-target LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.219\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:80, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:08:05.185246 6150 services_controller.go:444] Built service openshift-network-diagnostics/network-check-target LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1122 04:08:05.185254 6150 services_controller.go:445] Built service openshift-network-diagnostics/network-check-target LB template configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:08:05.185278 6150 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not ad\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 
6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\"
:\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.853953 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.867632 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828
353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.872859 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.872913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.872964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.872988 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.873004 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.880712 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.894125 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.904160 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.915742 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.929253 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.944177 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a
44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.956871 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.968256 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.968470 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/2.log" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.972178 4699 scope.go:117] "RemoveContainer" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" Nov 22 04:08:31 crc kubenswrapper[4699]: E1122 04:08:31.972457 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.975508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:31 crc 
kubenswrapper[4699]: I1122 04:08:31.975606 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.975672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.975746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.975811 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:31Z","lastTransitionTime":"2025-11-22T04:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.982252 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:31 crc kubenswrapper[4699]: I1122 04:08:31.992421 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:31Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.005747 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.018882 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc 
kubenswrapper[4699]: I1122 04:08:32.033512 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46
b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.047968 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.061327 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.073997 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.077655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.077778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.077793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.077816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.077831 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.087293 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.106126 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.118791 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a
44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.130847 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.142392 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.157953 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.170275 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.180721 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.180837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.180868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.180898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.180925 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.186892 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.202231 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc 
kubenswrapper[4699]: I1122 04:08:32.222143 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.246594 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.257817 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.277389 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.284076 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.284117 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.284129 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.284146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.284160 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.294937 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:32Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.389852 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.389906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.389924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.389947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.389963 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.447612 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:32 crc kubenswrapper[4699]: E1122 04:08:32.447747 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.447930 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:32 crc kubenswrapper[4699]: E1122 04:08:32.447976 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.448070 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:32 crc kubenswrapper[4699]: E1122 04:08:32.448110 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.493177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.493244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.493263 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.493288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.493305 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.596202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.596254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.596267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.596293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.596307 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.699726 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.699774 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.699783 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.699801 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.699812 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.802229 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.802279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.802295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.802319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.802354 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.905258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.905725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.905752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.905783 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:32 crc kubenswrapper[4699]: I1122 04:08:32.905802 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:32Z","lastTransitionTime":"2025-11-22T04:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.008240 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.008278 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.008288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.008305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.008315 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.111122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.111187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.111203 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.111229 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.111246 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.213811 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.213872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.213890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.213916 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.213934 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.317633 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.317694 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.317713 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.317736 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.317753 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.421010 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.421088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.421113 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.421149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.421176 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.447796 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:33 crc kubenswrapper[4699]: E1122 04:08:33.448014 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.523918 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.523958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.523970 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.523987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.523998 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.626359 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.626410 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.626450 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.626479 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.626497 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.729343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.729447 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.729460 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.729516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.729532 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.831799 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.831843 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.831855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.831872 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.831886 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.933775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.933847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.933857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.933882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:33 crc kubenswrapper[4699]: I1122 04:08:33.933893 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:33Z","lastTransitionTime":"2025-11-22T04:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.036188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.036248 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.036263 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.036284 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.036296 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.139296 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.139338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.139353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.139370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.139382 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.242607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.242644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.242655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.242673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.242686 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.344546 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.344584 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.344593 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.344618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.344628 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.446799 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.446822 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.446849 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:34 crc kubenswrapper[4699]: E1122 04:08:34.446956 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.447036 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.447061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.447072 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.447088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.447100 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: E1122 04:08:34.447116 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:34 crc kubenswrapper[4699]: E1122 04:08:34.447198 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.549326 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.549393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.549406 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.549427 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.549454 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.651802 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.651874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.651888 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.651906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.651919 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.754345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.754390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.754398 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.754415 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.754425 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.857404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.857475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.857491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.857514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.857527 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.960117 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.960161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.960170 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.960196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:34 crc kubenswrapper[4699]: I1122 04:08:34.960205 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:34Z","lastTransitionTime":"2025-11-22T04:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.063531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.063605 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.063616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.063638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.063649 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.165751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.165791 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.165799 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.165814 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.165825 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.268411 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.268470 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.268481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.268501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.268513 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.371456 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.371494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.371502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.371518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.371531 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.447164 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:35 crc kubenswrapper[4699]: E1122 04:08:35.447353 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.474161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.474229 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.474251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.474280 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.474301 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.577890 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.577976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.577996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.578023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.578042 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.680334 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.680401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.680423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.680487 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.680511 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.783286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.783340 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.783354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.783372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.783385 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.885951 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.886090 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.886110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.886136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.886187 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.989099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.989168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.989187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.989217 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:35 crc kubenswrapper[4699]: I1122 04:08:35.989235 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:35Z","lastTransitionTime":"2025-11-22T04:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.091577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.091630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.091643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.091663 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.091676 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.194364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.194411 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.194423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.194461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.194473 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.297231 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.297331 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.297342 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.297358 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.297371 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.400806 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.400858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.400870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.400888 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.400900 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.447322 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.447499 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:36 crc kubenswrapper[4699]: E1122 04:08:36.447623 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.447671 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:36 crc kubenswrapper[4699]: E1122 04:08:36.447821 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:36 crc kubenswrapper[4699]: E1122 04:08:36.447918 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.503419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.503485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.503496 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.503514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.503528 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.606052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.606120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.606136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.606166 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.606184 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.708759 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.708812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.708827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.708851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.708865 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.811599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.811643 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.811660 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.811683 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.811699 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.914421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.914712 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.914724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.914744 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:36 crc kubenswrapper[4699]: I1122 04:08:36.914757 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:36Z","lastTransitionTime":"2025-11-22T04:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.017362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.017405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.017421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.017459 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.017472 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.121079 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.121137 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.121147 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.121173 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.121187 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.224962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.225005 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.225014 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.225029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.225042 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.317678 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.317906 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.318037 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:09:09.317997362 +0000 UTC m=+100.660618549 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.328051 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.328118 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.328136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.328169 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.328187 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.389768 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.389814 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.389828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.389846 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.389861 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.409187 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.414511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.414571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.414590 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.414620 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.414637 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.433235 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.441134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.441186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.441198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.441217 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.441232 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.447392 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.447631 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.460943 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.467480 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.467513 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.467524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.467560 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.467572 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.479961 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.483810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.483906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.483929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.483959 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.483990 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.496614 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:37Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:37 crc kubenswrapper[4699]: E1122 04:08:37.496780 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.498853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.498893 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.498902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.498918 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.498930 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.601927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.601967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.601978 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.601996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.602006 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.704949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.704991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.705003 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.705022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.705033 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.808411 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.808504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.808524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.808555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.808574 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.911620 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.911685 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.911694 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.911709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:37 crc kubenswrapper[4699]: I1122 04:08:37.911722 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:37Z","lastTransitionTime":"2025-11-22T04:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.014333 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.014376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.014385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.014401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.014412 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.117469 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.117539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.117557 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.117583 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.117600 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.220565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.220646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.220670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.220701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.220725 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.324181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.324233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.324245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.324272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.324287 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.427408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.427465 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.427474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.427491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.427503 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.447274 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.447361 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:38 crc kubenswrapper[4699]: E1122 04:08:38.447474 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.447360 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:38 crc kubenswrapper[4699]: E1122 04:08:38.447600 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:38 crc kubenswrapper[4699]: E1122 04:08:38.447690 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.530822 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.530902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.530921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.530947 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.530964 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.634217 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.634265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.634275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.634302 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.634313 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.736461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.736520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.736537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.736563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.736581 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.839039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.839091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.839107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.839129 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.839142 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.941915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.941954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.941968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.941989 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:38 crc kubenswrapper[4699]: I1122 04:08:38.942003 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:38Z","lastTransitionTime":"2025-11-22T04:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.044820 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.044897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.044915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.044939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.044958 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.148093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.148183 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.148213 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.148246 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.148271 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.251293 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.251346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.251360 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.251381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.251395 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.354006 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.354053 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.354061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.354078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.354094 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.447331 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:39 crc kubenswrapper[4699]: E1122 04:08:39.447855 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.456857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.457048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.457104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.457163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.457224 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.463560 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.481122 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.500207 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.513582 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.527000 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.539684 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.552644 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.559584 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.559635 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.559652 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.559672 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.559683 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.565493 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.578276 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.592081 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.609856 4699 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.624318 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.636693 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc 
kubenswrapper[4699]: I1122 04:08:39.650658 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.665550 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.665633 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.665646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.665667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.665682 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.670858 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.685868 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.705673 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.720894 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:39Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.768197 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.768253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.768266 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.768288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.768301 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.871344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.871408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.871449 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.871473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.871485 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.974187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.974251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.974265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.974287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.974300 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:39Z","lastTransitionTime":"2025-11-22T04:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.997352 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/0.log" Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.997459 4699 generic.go:334] "Generic (PLEG): container finished" podID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" containerID="f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8" exitCode=1 Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.997498 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerDied","Data":"f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8"} Nov 22 04:08:39 crc kubenswrapper[4699]: I1122 04:08:39.997944 4699 scope.go:117] "RemoveContainer" containerID="f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.015809 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.032821 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.049575 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.064687 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.077979 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.078039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.078058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.078082 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.078099 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.081531 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.102048 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.115748 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.135358 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.151212 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.165298 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.178279 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.182704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.182761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.182778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.182797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.182810 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.192945 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.203547 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc 
kubenswrapper[4699]: I1122 04:08:40.219036 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.239693 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.254670 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.281920 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.285336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.285595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.285685 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.285780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.285870 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.296732 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:40Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.388211 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.388255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.388265 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.388284 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.388293 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.448102 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.448102 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.448137 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:40 crc kubenswrapper[4699]: E1122 04:08:40.448670 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:40 crc kubenswrapper[4699]: E1122 04:08:40.448837 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:40 crc kubenswrapper[4699]: E1122 04:08:40.448783 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.498145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.498518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.498626 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.498711 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.498800 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.601719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.602115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.602188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.602286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.602347 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.705741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.705810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.705823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.705845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.705860 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.809815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.810281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.810494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.810680 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.810866 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.913686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.913955 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.914092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.914205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:40 crc kubenswrapper[4699]: I1122 04:08:40.914319 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:40Z","lastTransitionTime":"2025-11-22T04:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.004058 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/0.log" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.004363 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerStarted","Data":"b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.017319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.017351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.017361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.017375 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.017384 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.020464 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.037919 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] 
Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.050707 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.065565 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.078817 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc 
kubenswrapper[4699]: I1122 04:08:41.094523 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.110153 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.120471 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.120506 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.120516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.120531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.120542 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.121483 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.144383 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.158812 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.171360 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.194863 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.210723 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.223351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.223405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.223414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 
04:08:41.223445 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.223458 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.224418 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.236068 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.250510 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.268641 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.286187 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:41Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.326975 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.327019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.327030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.327046 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.327057 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.430585 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.430638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.430650 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.430667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.430677 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.446950 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:41 crc kubenswrapper[4699]: E1122 04:08:41.447123 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.533312 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.533356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.533366 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.533385 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.533397 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.636635 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.636689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.636703 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.636722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.636733 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.739646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.739716 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.739734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.739793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.739814 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.842977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.843022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.843031 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.843050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.843061 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.946084 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.946165 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.946186 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.946216 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:41 crc kubenswrapper[4699]: I1122 04:08:41.946237 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:41Z","lastTransitionTime":"2025-11-22T04:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.049889 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.049963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.049990 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.050020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.050047 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.154135 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.154188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.154205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.154224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.154236 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.256830 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.256904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.256933 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.256967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.256986 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.359883 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.359951 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.359965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.359986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.360001 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.447201 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.447323 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:42 crc kubenswrapper[4699]: E1122 04:08:42.447822 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.447424 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:42 crc kubenswrapper[4699]: E1122 04:08:42.448080 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:42 crc kubenswrapper[4699]: E1122 04:08:42.447896 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.462949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.463138 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.463234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.463304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.463372 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.566322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.566364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.566373 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.566388 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.566399 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.669060 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.669121 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.669134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.669158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.669172 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.772279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.772341 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.772356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.772376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.772394 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.875853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.875901 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.875912 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.875929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.875940 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.977882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.977933 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.977942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.977958 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:42 crc kubenswrapper[4699]: I1122 04:08:42.977967 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:42Z","lastTransitionTime":"2025-11-22T04:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.080836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.080973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.080995 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.081020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.081041 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.183219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.183249 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.183257 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.183271 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.183280 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.285851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.285883 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.285901 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.285916 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.285924 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.387999 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.388039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.388048 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.388062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.388074 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.447101 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:43 crc kubenswrapper[4699]: E1122 04:08:43.447360 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.490145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.490187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.490198 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.490218 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.490230 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.592426 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.592478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.592488 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.592502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.592510 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.695282 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.695356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.695374 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.695399 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.695416 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.798522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.798565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.798577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.798595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.798607 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.900657 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.900700 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.900712 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.900730 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:43 crc kubenswrapper[4699]: I1122 04:08:43.900743 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:43Z","lastTransitionTime":"2025-11-22T04:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.004050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.004519 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.004681 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.004789 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.004871 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.108334 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.108390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.108407 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.108494 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.108516 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.212351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.212396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.212406 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.212423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.212455 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.315058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.315115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.315130 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.315149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.315165 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.418320 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.418365 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.418374 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.418393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.418404 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.447201 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.447201 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:44 crc kubenswrapper[4699]: E1122 04:08:44.447330 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:44 crc kubenswrapper[4699]: E1122 04:08:44.447523 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.447829 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:44 crc kubenswrapper[4699]: E1122 04:08:44.448195 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.520753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.520794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.520805 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.520822 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.520836 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.623831 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.623898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.623912 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.623936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.623957 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.726572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.727005 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.727088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.727158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.727227 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.830119 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.830499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.830590 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.830661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.830723 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.933413 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.933503 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.933517 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.933536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:44 crc kubenswrapper[4699]: I1122 04:08:44.933550 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:44Z","lastTransitionTime":"2025-11-22T04:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.037603 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.037898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.038073 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.038182 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.038308 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.141556 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.141636 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.141666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.141704 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.141730 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.244797 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.244863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.244880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.244908 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.244929 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.347923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.348008 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.348030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.348054 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.348072 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.448128 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:45 crc kubenswrapper[4699]: E1122 04:08:45.448289 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.450682 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.450731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.450747 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.450770 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.450786 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.553301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.553375 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.553397 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.553422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.553479 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.656301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.656356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.656378 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.656409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.656504 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.760723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.760792 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.760815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.760846 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.760902 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.864004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.864116 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.864134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.864161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.864178 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.966775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.966809 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.966820 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.966840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:45 crc kubenswrapper[4699]: I1122 04:08:45.966851 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:45Z","lastTransitionTime":"2025-11-22T04:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.069902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.069973 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.069997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.070033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.070058 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.174113 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.174188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.174200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.174239 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.174261 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.277749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.277823 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.277848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.277884 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.277909 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.380502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.380775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.380807 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.380842 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.380866 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.447843 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.447945 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.447973 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:46 crc kubenswrapper[4699]: E1122 04:08:46.448459 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:46 crc kubenswrapper[4699]: E1122 04:08:46.448698 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.448805 4699 scope.go:117] "RemoveContainer" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" Nov 22 04:08:46 crc kubenswrapper[4699]: E1122 04:08:46.448848 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:46 crc kubenswrapper[4699]: E1122 04:08:46.449227 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.484423 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.484489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.484501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.484523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.484536 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.587253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.587319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.587341 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.587366 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.587388 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.690370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.690518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.690550 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.690586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.690611 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.794379 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.794481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.794501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.794528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.794547 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.898609 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.898688 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.898712 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.898748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:46 crc kubenswrapper[4699]: I1122 04:08:46.898772 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:46Z","lastTransitionTime":"2025-11-22T04:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.002176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.002261 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.002283 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.002314 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.002331 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.105528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.105583 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.105595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.105613 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.105628 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.209113 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.209182 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.209197 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.209219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.209234 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.313336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.313408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.313464 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.313501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.313524 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.416934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.417029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.417128 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.417163 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.417189 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.446981 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.447243 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.519563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.519596 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.519604 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.519618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.519628 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.622864 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.622923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.622939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.622967 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.622985 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.726156 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.726219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.726235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.726259 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.726276 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.819377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.819559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.819582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.819613 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.819631 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.842670 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.848608 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.848666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.848689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.848720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.848743 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.870689 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.876123 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.876191 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.876208 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.876234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.876252 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.899094 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.904454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.904520 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.904538 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.904564 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.904580 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.924022 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.929508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.929558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.929574 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.929598 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.929619 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.945765 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:47Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:47 crc kubenswrapper[4699]: E1122 04:08:47.945955 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.947871 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.947909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.947921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.947938 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:47 crc kubenswrapper[4699]: I1122 04:08:47.947954 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:47Z","lastTransitionTime":"2025-11-22T04:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.051019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.051084 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.051097 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.051120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.051138 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.154394 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.154500 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.154518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.154541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.154557 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.257419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.257529 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.257553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.257586 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.257609 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.365734 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.365861 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.365897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.365952 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.365976 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.446850 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.446913 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.446850 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:48 crc kubenswrapper[4699]: E1122 04:08:48.447101 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:48 crc kubenswrapper[4699]: E1122 04:08:48.447252 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:48 crc kubenswrapper[4699]: E1122 04:08:48.447380 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.469389 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.469502 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.469530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.469555 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.469573 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.572504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.572571 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.572655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.572686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.572708 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.675695 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.675806 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.675830 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.675855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.675873 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.778878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.778960 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.778984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.779016 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.779040 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.883828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.883874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.883886 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.883904 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.883917 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.988175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.988222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.988234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.988254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:48 crc kubenswrapper[4699]: I1122 04:08:48.988267 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:48Z","lastTransitionTime":"2025-11-22T04:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.091981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.092061 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.092081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.092108 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.092126 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.195381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.195462 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.195475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.195495 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.195508 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.299023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.299089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.299100 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.299120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.299137 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.402862 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.402921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.402932 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.402950 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.402961 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.447736 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:49 crc kubenswrapper[4699]: E1122 04:08:49.447943 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.466021 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.483733 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.496052 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc 
kubenswrapper[4699]: I1122 04:08:49.505296 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.505338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.505349 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.505366 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.505376 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.508767 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3b
d9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.528504 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.539553 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.559103 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.571982 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.585930 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828
353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.598276 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.609028 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.609071 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.609091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 
04:08:49.609110 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.609122 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.610726 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.622535 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.635616 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.651805 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a
1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98
100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.664882 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472024
3b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11
-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.682164 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.697543 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.712681 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.712757 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.712769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.712788 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.712798 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.713490 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:49Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.815333 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.815431 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.815459 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.815484 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.815498 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.919022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.919070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.919083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.919101 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:49 crc kubenswrapper[4699]: I1122 04:08:49.919113 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:49Z","lastTransitionTime":"2025-11-22T04:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.021302 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.021347 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.021363 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.021383 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.021397 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.125115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.125210 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.125243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.125281 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.125306 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.228195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.228272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.228292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.228322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.228343 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.331796 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.331833 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.331842 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.331856 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.331865 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.433815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.433880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.433897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.433923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.433940 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.447112 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:50 crc kubenswrapper[4699]: E1122 04:08:50.447281 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.447584 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:50 crc kubenswrapper[4699]: E1122 04:08:50.447678 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.447722 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:50 crc kubenswrapper[4699]: E1122 04:08:50.447886 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.536794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.536831 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.536839 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.536858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.536871 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.639580 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.639622 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.639631 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.639645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.639655 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.742383 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.742497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.742515 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.742539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.742555 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.845640 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.845701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.845724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.845753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.845775 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.948582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.948648 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.948666 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.948691 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:50 crc kubenswrapper[4699]: I1122 04:08:50.948712 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:50Z","lastTransitionTime":"2025-11-22T04:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.052140 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.052233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.052258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.052291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.052312 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.155764 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.155840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.155859 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.155886 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.155905 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.259078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.259127 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.259139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.259161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.259174 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.361738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.361817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.361846 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.361887 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.361915 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.447769 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:51 crc kubenswrapper[4699]: E1122 04:08:51.448050 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.464878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.464942 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.464954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.464975 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.464993 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.567750 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.567813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.567824 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.567844 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.567857 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.670882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.670933 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.670962 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.670984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.670997 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.780112 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.780202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.780393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.780518 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.780542 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.884140 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.884214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.884233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.884261 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.884281 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.987878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.987965 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.987985 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.988012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:51 crc kubenswrapper[4699]: I1122 04:08:51.988029 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:51Z","lastTransitionTime":"2025-11-22T04:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.091425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.091524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.091546 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.091575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.091595 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.195925 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.196001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.196018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.196393 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.196588 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.299782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.299873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.299897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.299927 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.299950 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.402766 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.402816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.402827 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.402845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.402859 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.447503 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.447571 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.447744 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:52 crc kubenswrapper[4699]: E1122 04:08:52.447880 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:52 crc kubenswrapper[4699]: E1122 04:08:52.448121 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:52 crc kubenswrapper[4699]: E1122 04:08:52.448245 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.463320 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.506638 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.506696 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.506708 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.506731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.506749 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.609125 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.609168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.609179 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.609196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.609209 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.712057 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.712144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.712165 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.712232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.712252 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.815893 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.815934 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.815944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.815964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.815978 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.918594 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.918642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.918662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.918689 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:52 crc kubenswrapper[4699]: I1122 04:08:52.918708 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:52Z","lastTransitionTime":"2025-11-22T04:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.022425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.022542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.022565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.022604 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.022631 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.126223 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.126292 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.126316 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.126345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.126369 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.229178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.229595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.229949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.230091 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.230221 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.333376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.333443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.333453 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.333471 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.333482 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.437030 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.437078 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.437092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.437111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.437126 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.447555 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:53 crc kubenswrapper[4699]: E1122 04:08:53.447715 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.541089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.541177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.541196 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.541247 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.541261 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.644282 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.644350 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.644361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.644380 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.644392 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.747649 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.747715 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.747732 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.747759 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.747785 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.851511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.851580 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.851598 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.851624 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.851645 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.954628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.954681 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.954699 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.954719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:53 crc kubenswrapper[4699]: I1122 04:08:53.954733 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:53Z","lastTransitionTime":"2025-11-22T04:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.057923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.057997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.058021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.058047 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.058064 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.160531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.160670 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.160717 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.160754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.160779 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.271033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.271107 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.271146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.271178 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.271201 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.323485 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.323674 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.323719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.323816 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:09:58.323766968 +0000 UTC m=+149.666388155 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.323893 4699 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.323924 4699 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.323982 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 04:09:58.323962592 +0000 UTC m=+149.666583789 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.324034 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-11-22 04:09:58.324002013 +0000 UTC m=+149.666623230 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.375094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.375166 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.375187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.375217 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.375241 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.425210 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.425276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425492 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425543 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425560 4699 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425560 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 
04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425597 4699 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425619 4699 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425647 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 04:09:58.425620613 +0000 UTC m=+149.768241800 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.425699 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 04:09:58.425674624 +0000 UTC m=+149.768295841 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.447578 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.447638 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.447594 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.447793 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.447947 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:54 crc kubenswrapper[4699]: E1122 04:08:54.448248 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.477829 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.477880 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.477891 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.477911 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.477925 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.580964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.581046 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.581058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.581077 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.581088 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.684374 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.684481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.684508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.684542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.684566 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.788160 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.788336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.788351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.788376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.788392 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.891473 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.891525 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.891539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.891561 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.891573 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.994828 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.994953 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.994968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.994991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:54 crc kubenswrapper[4699]: I1122 04:08:54.995007 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:54Z","lastTransitionTime":"2025-11-22T04:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.098569 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.098657 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.098678 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.098709 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.098728 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.202081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.202139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.202149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.202169 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.202181 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.305412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.305486 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.305497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.305514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.305524 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.408742 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.408807 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.408818 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.408838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.408854 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.447250 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:55 crc kubenswrapper[4699]: E1122 04:08:55.447599 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.511357 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.511404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.511452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.511468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.511481 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.614055 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.614134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.614145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.614164 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.614195 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.717466 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.717534 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.717551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.717577 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.717594 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.820023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.820094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.820116 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.820177 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.820195 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.925219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.925274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.925286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.925305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:55 crc kubenswrapper[4699]: I1122 04:08:55.925320 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:55Z","lastTransitionTime":"2025-11-22T04:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.028544 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.028625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.028686 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.028721 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.028746 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.132382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.132616 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.132642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.132667 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.132684 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.236272 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.236339 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.236361 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.236391 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.236411 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.339497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.339540 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.339548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.339562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.339574 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.443077 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.443172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.443190 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.443218 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.443234 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.447551 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.447587 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.447619 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:56 crc kubenswrapper[4699]: E1122 04:08:56.447883 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:56 crc kubenswrapper[4699]: E1122 04:08:56.448017 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:56 crc kubenswrapper[4699]: E1122 04:08:56.448218 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.547252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.547312 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.547328 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.547352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.547369 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.650743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.650808 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.650826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.650853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.650872 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.753944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.754015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.754039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.754069 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.754092 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.860547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.860600 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.860611 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.860627 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.860641 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.963758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.963815 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.963837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.963885 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:56 crc kubenswrapper[4699]: I1122 04:08:56.963909 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:56Z","lastTransitionTime":"2025-11-22T04:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.066120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.066199 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.066217 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.066245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.066264 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.168425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.168481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.168491 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.168507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.168516 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.271212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.271295 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.271319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.271346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.271364 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.374675 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.374738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.374754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.374778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.374795 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.447529 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:57 crc kubenswrapper[4699]: E1122 04:08:57.447735 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.477798 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.477847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.477863 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.477885 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.477903 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.581129 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.581203 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.581227 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.581260 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.581282 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.684429 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.684551 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.684573 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.684597 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.684614 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.787790 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.787841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.787861 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.787884 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.787900 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.891566 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.891628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.891646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.891673 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.891691 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.994559 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.994659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.994688 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.994723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:57 crc kubenswrapper[4699]: I1122 04:08:57.994748 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:57Z","lastTransitionTime":"2025-11-22T04:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.097514 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.097590 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.097607 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.097632 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.097649 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.178399 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.178474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.178484 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.178507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.178522 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.199163 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.204379 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.204422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.204455 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.204478 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.204492 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.226878 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.234273 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.234351 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.234379 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.234413 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.234494 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.263575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.263630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.263644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.263664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.263677 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.286290 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.286381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.286404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.286463 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.286483 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.302485 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:58Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.302710 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.305172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.305227 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.305244 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.305271 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.305289 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.407701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.407765 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.407776 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.407793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.407804 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.447489 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.447489 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.447641 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.447509 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.447701 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:08:58 crc kubenswrapper[4699]: E1122 04:08:58.447765 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.511370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.511474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.511501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.511532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.511556 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.621202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.621287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.621304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.621371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.621387 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.724642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.724701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.724719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.724744 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.724763 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.827956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.828018 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.828040 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.828070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.828091 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.931653 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.931717 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.931741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.931772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:58 crc kubenswrapper[4699]: I1122 04:08:58.931793 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:58Z","lastTransitionTime":"2025-11-22T04:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.036195 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.036258 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.036275 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.036299 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.036316 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.139874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.139986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.140007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.140033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.140052 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.244252 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.244338 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.244358 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.244390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.244411 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.347337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.347412 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.347464 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.347492 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.347513 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.447698 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:08:59 crc kubenswrapper[4699]: E1122 04:08:59.447919 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.451534 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.451609 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.451628 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.451656 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.451676 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.466788 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.488997 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.513111 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab78017
1e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.537593 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828
353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.555536 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.555596 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.556674 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.556750 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.556770 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.556917 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32845a6f-f693-4d06-89a5-b35cd75685f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.577523 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.595814 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.613636 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.631258 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.649830 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.660230 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.660271 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.660283 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.660303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.660318 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.667153 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.686102 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.702574 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.718237 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc 
kubenswrapper[4699]: I1122 04:08:59.732426 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.754401 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.763781 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.763834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.763853 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.763879 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.763898 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.789857 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be
30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:
32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.806511 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.830754 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:08:59Z is after 2025-08-24T17:21:41Z" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.866355 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.866396 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.866408 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.866443 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.866454 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.968665 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.968919 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.968945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.968977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:08:59 crc kubenswrapper[4699]: I1122 04:08:59.969000 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:08:59Z","lastTransitionTime":"2025-11-22T04:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.074076 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.074143 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.074168 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.074356 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.074485 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.177151 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.177205 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.177216 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.177234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.177245 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.280381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.280505 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.280526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.280553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.280572 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.383022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.383075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.383087 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.383105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.383118 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.447366 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.447425 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.447370 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:00 crc kubenswrapper[4699]: E1122 04:09:00.447638 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:00 crc kubenswrapper[4699]: E1122 04:09:00.447757 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:00 crc kubenswrapper[4699]: E1122 04:09:00.448201 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.485637 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.485695 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.485722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.485752 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.485774 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.589045 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.589104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.589122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.589146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.589163 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.693003 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.693076 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.693102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.693134 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.693156 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.796609 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.796694 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.796712 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.796742 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.796762 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.901081 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.901158 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.901176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.901200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:00 crc kubenswrapper[4699]: I1122 04:09:00.901217 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:00Z","lastTransitionTime":"2025-11-22T04:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.004548 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.004646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.004664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.004691 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.004710 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.106826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.106868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.106878 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.106896 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.106910 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.210708 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.210767 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.210781 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.210801 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.210815 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.314050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.314122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.314145 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.314182 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.314204 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.417751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.417838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.417862 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.417894 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.417914 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.447965 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:01 crc kubenswrapper[4699]: E1122 04:09:01.449078 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.449292 4699 scope.go:117] "RemoveContainer" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.521232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.521291 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.521312 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.521339 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.521358 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.624981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.625090 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.625101 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.625120 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.625132 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.735256 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.735304 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.735317 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.735337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.735350 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.839237 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.839303 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.839322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.839348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.839368 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.951052 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.951097 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.951109 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.951125 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:01 crc kubenswrapper[4699]: I1122 04:09:01.951138 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:01Z","lastTransitionTime":"2025-11-22T04:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.053414 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.053486 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.053496 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.053511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.053520 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.084026 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/2.log" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.086336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.087340 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.098787 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8
ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.110719 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.132601 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c
287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156390 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156407 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156417 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.156599 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.167974 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32845a6f-f693-4d06-89a5-b35cd75685f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.179986 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.190345 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.203299 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.216580 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.227887 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.241452 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] 
Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.256345 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.259139 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.259174 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.259183 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.259200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.259209 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.270894 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.282767 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc 
kubenswrapper[4699]: I1122 04:09:02.297343 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.312770 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.333821 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.344806 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.361692 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.361743 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.361754 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.361775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc 
kubenswrapper[4699]: I1122 04:09:02.361794 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.365470 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:02Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.447587 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.447664 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:02 crc kubenswrapper[4699]: E1122 04:09:02.447718 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.447596 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:02 crc kubenswrapper[4699]: E1122 04:09:02.447918 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:02 crc kubenswrapper[4699]: E1122 04:09:02.448082 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.464874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.464928 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.464984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.465019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.465046 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.568639 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.568701 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.568725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.568751 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.568772 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.672327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.672401 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.672419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.672482 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.672504 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.775906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.775997 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.776020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.776050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.776071 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.879461 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.879529 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.879544 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.879567 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.879585 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.983372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.983455 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.983468 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.983486 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:02 crc kubenswrapper[4699]: I1122 04:09:02.983499 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:02Z","lastTransitionTime":"2025-11-22T04:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.086521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.087033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.087065 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.087089 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.087106 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.092603 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/3.log" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.093687 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/2.log" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.098253 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" exitCode=1 Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.098312 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.098365 4699 scope.go:117] "RemoveContainer" containerID="aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.099705 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:09:03 crc kubenswrapper[4699]: E1122 04:09:03.109043 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.124197 4699 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.147896 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.166282 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc 
kubenswrapper[4699]: I1122 04:09:03.184912 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4eae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.190664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.190694 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.190706 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.190724 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.190738 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.209309 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.244029 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.261388 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.292820 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa68142f0ff1c2e1bd7c2534395b616a4b68c5e8dc9d16c6d10709b1ed3d8455\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:30Z\\\",\\\"message\\\":\\\" 6438 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1122 04:08:30.300232 6438 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1122 04:08:30.300243 6438 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 04:08:30.300257 6438 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1122 04:08:30.300270 6438 factory.go:656] Stopping watch factory\\\\nI1122 04:08:30.300285 6438 ovnkube.go:599] Stopped ovnkube\\\\nI1122 04:08:30.300330 6438 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 04:08:30.300345 6438 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 04:08:30.300351 6438 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1122 04:08:30.300356 6438 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 04:08:30.300361 6438 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 04:08:30.300366 6438 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1122 04:08:30.300372 6438 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1122 04:08:30.300377 6438 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 04:08:30.300383 6438 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 04:08:30.300391 6438 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 04:08:30.300476 6438 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:09:02Z\\\",\\\"message\\\":\\\" openshift-multus/network-metrics-daemon-pj52w\\\\nI1122 04:09:02.370975 6848 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI1122 04:09:02.373082 6848 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 4.804667ms\\\\nI1122 04:09:02.373107 6848 services_controller.go:356] Processing sync for service 
openshift-network-operator/metrics for network=default\\\\nI1122 04:09:02.373071 6848 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.183\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:09:02.373150 6848 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:09:02.372977 6848 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},
{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.294555 4699 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.294614 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.294630 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.294655 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.294673 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.311589 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.331938 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38
fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.355223 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab78017
1e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.389748 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828
353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.397691 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.397746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.397762 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.397784 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.397799 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.408776 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32845a6f-f693-4d06-89a5-b35cd75685f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.427733 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.442685 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.446914 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:03 crc kubenswrapper[4699]: E1122 04:09:03.447047 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.460766 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.478040 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.489867 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.500593 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.500648 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.500664 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.500682 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.500691 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.505812 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:03Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.604642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.604710 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.604731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.604758 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.604778 4699 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.707858 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.707920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.707943 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.708027 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.708044 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.811873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.811938 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.811955 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.811980 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.811997 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.914832 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.914920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.914941 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.914969 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:03 crc kubenswrapper[4699]: I1122 04:09:03.914988 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:03Z","lastTransitionTime":"2025-11-22T04:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.017219 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.017344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.017376 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.017404 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.017422 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.104803 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/3.log" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.109549 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:09:04 crc kubenswrapper[4699]: E1122 04:09:04.109829 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.120474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.120529 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.120545 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.120567 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.120583 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.134192 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1122 04:07:50.439842 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.150573 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32845a6f-f693-4d06-89a5-b35cd75685f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.171179 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.191356 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.206574 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.220289 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d148
18bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.224212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.224267 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.224277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc 
kubenswrapper[4699]: I1122 04:09:04.224298 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.224312 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.241600 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85
4f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.258754 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.281578 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.301853 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.318715 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] 
Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.327842 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.327898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.327915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.327943 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.327961 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.335343 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.356668 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.371097 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc 
kubenswrapper[4699]: I1122 04:09:04.386989 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.419250 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.434172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.434214 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.434225 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.434243 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.434255 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.437110 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.446885 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.446919 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.447175 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:04 crc kubenswrapper[4699]: E1122 04:09:04.447285 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:04 crc kubenswrapper[4699]: E1122 04:09:04.447170 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:04 crc kubenswrapper[4699]: E1122 04:09:04.447388 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.472769 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:09:02Z\\\",\\\"message\\\":\\\" openshift-multus/network-metrics-daemon-pj52w\\\\nI1122 04:09:02.370975 6848 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI1122 04:09:02.373082 6848 services_controller.go:360] Finished 
syncing service dns-default on namespace openshift-dns for network=default : 4.804667ms\\\\nI1122 04:09:02.373107 6848 services_controller.go:356] Processing sync for service openshift-network-operator/metrics for network=default\\\\nI1122 04:09:02.373071 6848 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.183\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:09:02.373150 6848 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:09:02.372977 6848 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:09:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.487542 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:04Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.537964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.538031 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.538052 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.538079 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.538100 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.640866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.640919 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.640931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.640956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.640970 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.743937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.743989 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.744002 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.744020 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.744031 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.847253 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.847318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.847337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.847367 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.847387 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.950241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.950321 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.950344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.950375 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:04 crc kubenswrapper[4699]: I1122 04:09:04.950398 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:04Z","lastTransitionTime":"2025-11-22T04:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.053280 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.053331 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.053343 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.053364 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.053377 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.156659 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.156727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.156741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.156760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.156773 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.259737 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.259806 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.259826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.259857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.259877 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.363207 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.363266 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.363279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.363301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.363317 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.446935 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:05 crc kubenswrapper[4699]: E1122 04:09:05.447157 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.466259 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.466327 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.466353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.466382 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.466477 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.569920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.570007 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.570033 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.570062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.570084 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.674019 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.674131 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.674153 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.674183 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.674211 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.777419 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.777499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.777511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.777537 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.777552 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.880625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.880693 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.880714 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.880738 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.880759 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.984575 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.984677 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.984695 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.984725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:05 crc kubenswrapper[4699]: I1122 04:09:05.984743 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:05Z","lastTransitionTime":"2025-11-22T04:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.088204 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.088286 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.088305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.088331 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.088351 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.191737 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.191789 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.191802 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.191820 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.191835 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.302162 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.302238 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.302250 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.302279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.302290 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.405699 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.405779 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.405801 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.405830 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.405853 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.447611 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.447917 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.448067 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:06 crc kubenswrapper[4699]: E1122 04:09:06.448140 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:06 crc kubenswrapper[4699]: E1122 04:09:06.448315 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:06 crc kubenswrapper[4699]: E1122 04:09:06.448534 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.510096 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.510175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.510200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.510232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.510273 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.614202 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.614283 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.614306 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.614339 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.614363 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.718389 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.718452 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.718469 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.718496 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.718512 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.822055 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.822105 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.822118 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.822135 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.822148 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.925870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.925913 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.925925 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.925944 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:06 crc kubenswrapper[4699]: I1122 04:09:06.925955 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:06Z","lastTransitionTime":"2025-11-22T04:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.029499 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.029554 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.029566 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.029591 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.029603 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.132557 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.132627 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.132645 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.132669 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.132686 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.236908 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.237618 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.237635 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.237662 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.237677 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.340915 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.340976 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.340991 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.341013 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.341029 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.443687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.443723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.443741 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.443760 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.443772 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.447296 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:07 crc kubenswrapper[4699]: E1122 04:09:07.447404 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.546938 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.546992 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.547006 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.547028 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.547040 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.650712 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.650819 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.650840 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.650886 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.650914 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.754087 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.754147 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.754156 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.754170 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.754181 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.857530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.857581 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.857595 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.857615 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.857629 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.960866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.960921 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.960931 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.960949 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:07 crc kubenswrapper[4699]: I1122 04:09:07.960960 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:07Z","lastTransitionTime":"2025-11-22T04:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.063475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.063535 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.063553 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.063572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.063584 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.167274 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.167337 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.167354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.167380 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.167398 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.270821 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.270883 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.270896 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.270918 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.270935 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.374016 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.374092 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.374109 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.374136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.374155 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.447726 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.447774 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.447774 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.448197 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.448383 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.448540 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.461422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.461508 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.461526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.461552 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.461571 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.482214 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.487530 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.487668 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.487782 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.487917 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.488032 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.502818 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.507350 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.507526 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.507650 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.507769 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.507891 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.524385 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.533421 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.533487 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.533511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.533532 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.533547 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.551612 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.556740 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.556781 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.556795 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.556817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.556834 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.571812 4699 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T04:09:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4852b328-c4f8-4280-9881-83927c94bf9a\\\",\\\"systemUUID\\\":\\\"76c96961-7d99-459e-9731-5ae805318244\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:08Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:08 crc kubenswrapper[4699]: E1122 04:09:08.571996 4699 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.573705 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.573746 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.573759 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.573778 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.573793 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.676922 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.676971 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.676981 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.677001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.677011 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.780602 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.780676 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.780699 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.780725 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.780744 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.883564 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.883612 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.883625 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.883644 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.883660 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.986171 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.986238 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.986255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.986277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:08 crc kubenswrapper[4699]: I1122 04:09:08.986293 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:08Z","lastTransitionTime":"2025-11-22T04:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.089109 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.089161 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.089172 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.089192 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.089204 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.192413 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.192495 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.192693 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.192720 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.192736 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.296259 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.296306 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.296319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.296336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.296349 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.342930 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:09 crc kubenswrapper[4699]: E1122 04:09:09.343081 4699 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:09:09 crc kubenswrapper[4699]: E1122 04:09:09.343147 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs podName:82be5d0c-6f95-43e4-aa3c-9c56de3e200c nodeName:}" failed. No retries permitted until 2025-11-22 04:10:13.343128226 +0000 UTC m=+164.685749413 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs") pod "network-metrics-daemon-pj52w" (UID: "82be5d0c-6f95-43e4-aa3c-9c56de3e200c") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.398812 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.398847 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.398857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.398873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.398885 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.447681 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:09 crc kubenswrapper[4699]: E1122 04:09:09.447862 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.464235 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bdbae2-706a-4f84-9f56-5a42aec77762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc56d58ec38fe2e6ff34afa44193fd165159799c6184b7f1474c8b13087f257f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls
\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dtp5d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kjwnt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.514746 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b7225" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e5e536a-6797-4e6f-8160-1e23ddda1647\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07e7b4e6ae273aa9999ce9d0f198b8a9317611f11ddb313258aed23e3feff339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf
6655ceca624056c96c1151a2dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f128cadcfb0a4df0653ea593a4c57a41f9cf6655ceca624056c96c1151a2dc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bc5d9acbea845350c7d6b452aba02cbc6facc274bca9087d140f12e77545d64\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-11-22T04:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df49509d9dea8cc04da93b47beb47293467201be4b5bd609ae2c4f9f09cccfb6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5a5c287bee87038f10453e7901450084f47c02249bdb3c4ad1f5b53a52df4e7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e23de4df7d03e957a1dc68e031280da6ff795dcc8142b9ab780171e0e4f1a0c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:0
0Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854f22e07373dbd243f2dd8995f5ea0ec1a19e706e7e3d69962a74294cc1ab03\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:08:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbkvv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b7225\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.515772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.515826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.516021 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.516093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.516112 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.530665 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4653394-4b4d-4c44-bc9d-39f2eeadbee4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e08c778826ca87eedf7169382d30509a5d31e132f5c91ff2cf633a24e3a7dcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb226d8acfbc46b2a51a6c4ef5c04c1e17d99e9e82bad5950ccb4356fcc39eba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8c1d8b6512002b090f6fa191cc3dc7d55aeae6d135bca5df2c367fb2a4f68c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc4bf8d58b05d0044acc289a36a4eb6a4de51d5d0643239ff81fd7faff4531d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a830ee0215f9be64c00b7684e9a3cc3bb18fd71d60b1f63fb24da9e8d876589f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 04:07:50.127900 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 04:07:50.128059 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 04:07:50.128926 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2923111326/tls.crt::/tmp/serving-cert-2923111326/tls.key\\\\\\\"\\\\nI1122 04:07:50.418529 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 04:07:50.432499 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 04:07:50.432593 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 04:07:50.432650 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 04:07:50.432686 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 04:07:50.439773 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1122 04:07:50.439810 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 04:07:50.439829 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439834 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 04:07:50.439838 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 04:07:50.439842 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 04:07:50.439844 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 04:07:50.439864 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 04:07:50.442112 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e25f8f28cc3aca76ae535aa6084bd1f994cbd0eb679f6ea40938a7fe456b0e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://736b71e68cd911050ab426dd8560dcf1828353a8da0e185be6f6f7cbc83e6689\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.540750 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32845a6f-f693-4d06-89a5-b35cd75685f2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74d319bf4380d67d100b93621956d84606b64e3c4fe494e61dc658a4300bf124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://09d810e4209cf7eab6c6fbb0fedd46d64aee5d2b38b710e5bf19daa5515133f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.552028 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.564514 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09c43ee45b5065b7baee9b0025b5a73b4915b4577169a35be4378acf0e7cb603\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.577334 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-h6ndp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd066499-5bd5-459c-8a02-d02f716c8965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822e0ef5b78e9c1b19b56d52c7eed8ad0058cc30b405b2adf0e2a572afdaab5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9hhkm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-h6ndp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.591060 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9fe0275b-9174-4aab-9f0f-7c00a233de69\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3141a4a35fe91db661f1bbb69f481d1db9302e79a16e9bc2898f2fd5fbe0f445\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e418cb4f331bd30b224110514a5d766e31fd949210ed6eb5ea3e1e04b2f62d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb5d783b1e21eb55efe9affd3962651d2bc2f2345954fa40a00e5f9b481066fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6860d3b5c86b1ad3bd55fc98a44e7fd84d66a5237df59f47319f598420b0241f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.605035 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c858c4eaa869f479d0fbd62eadd41218ca8dddc7ae5ffd82d36977acde2e76ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.618895 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.619086 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.619200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.619232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.619249 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.619012 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.633967 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pmtb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c5f530d5-6f69-4838-a0dd-f4662ddbf85c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:08:39Z\\\",\\\"message\\\":\\\"2025-11-22T04:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c\\\\n2025-11-22T04:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_60dfe7ea-f5eb-4363-a49b-b3c5f3ab720c to /host/opt/cni/bin/\\\\n2025-11-22T04:07:54Z [verbose] multus-daemon started\\\\n2025-11-22T04:07:54Z [verbose] 
Readiness Indicator file check\\\\n2025-11-22T04:08:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx9x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pmtb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.647897 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.661857 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bfcbb63b703f8f023d54028af9011b37da8d2f7c9ac57e35129cd783f301876\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99bfafe09aabfb9e3715d3c7af12849e0c8cb66e5799011c8463c5043383fee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.675897 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pj52w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77tk6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pj52w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc 
kubenswrapper[4699]: I1122 04:09:09.689725 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e855881-4d77-4655-b4d7-a50fc081f993\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://545a27e66130160ef1d8557458a64a27f18292c157e2e6dab9aa75aea0532ee0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35e9c8adb3bd9249f6d7e57cd40e40951af0463e49765ba635707120d07e8b47\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1538d20749062691aa2368004d22a46e612186aee24cb92acc3ddb073f616a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e4a053080810e22083dda4eaba1155b7b547a214158f849f7e5778f2e37ccc0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.708356 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b193b41e-aa0e-4816-b965-7b7873dadf85\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cd4757f265f2b7a453efca645d83d5340e5ec206f6f9d40dd86010b90470498\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1996517d6f55ae1765dd9d101fede2963e7ac51a406bca35cab95fa45192623a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59408c7cd75594e068cdc4dadfec414fcc3d1604eea37ed708440fd1a4f019ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://516e9231111cee4a53c71bef07338222497c8ffb27edbfaddbcb2e58af61ae7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2097cbd81d5aedb02fafaae3f17840da75ab455e541c410ae2f70710548530ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3317c0f27aedf4b058d3691eecc6137c5eb326b39c39296a803aaf5082c696\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T04:07:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c7522bf201e8773f383b4c1360332af48b4bb55e42c9275d2b938f1ec9cc7d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd0fce682f3a6a2a81e18f0bf7af79d875f7f110c0df5e8fbb5b3987a26abab4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:29Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.721507 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.721768 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.721836 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.721924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.722033 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.724054 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86ztb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34d15248-9724-41b0-8370-66127cc18bbe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08e180e0857112708a5ca84fc45cd41b9aebc5eef5628d5666abc590d86242e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-799vb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86ztb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.743790 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T04:09:02Z\\\",\\\"message\\\":\\\" openshift-multus/network-metrics-daemon-pj52w\\\\nI1122 04:09:02.370975 6848 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI1122 04:09:02.373082 6848 services_controller.go:360] Finished 
syncing service dns-default on namespace openshift-dns for network=default : 4.804667ms\\\\nI1122 04:09:02.373107 6848 services_controller.go:356] Processing sync for service openshift-network-operator/metrics for network=default\\\\nI1122 04:09:02.373071 6848 services_controller.go:443] Built service openshift-machine-config-operator/machine-config-operator LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.183\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9001, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1122 04:09:02.373150 6848 services_controller.go:444] Built service openshift-machine-config-operator/machine-config-operator LB per-node configs for network=default: []services.lbConfig(nil)\\\\nF1122 04:09:02.372977 6848 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T04:09:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7e9075e8d0c8c8fc8
59e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T04:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T04:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-km2cd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:07:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z7552\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.757028 4699 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"686f15a0-53ce-4d3f-80e2-7d6272dc7d4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T04:08:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5501c17b8d8e321c7b94254ed053f943531df548575931c4ec091997d68572a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17cc1c0cd69753ab441348667255f1dc34d4e
ae5c0579a0f84eb5d6063f7970d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T04:08:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mxlj2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T04:08:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gqt5x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T04:09:09Z is after 2025-08-24T17:21:41Z" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.824572 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.824813 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.824994 4699 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.825104 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.825179 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.928226 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.928302 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.928329 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.928362 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:09 crc kubenswrapper[4699]: I1122 04:09:09.928384 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:09Z","lastTransitionTime":"2025-11-22T04:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.031855 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.031909 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.031920 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.031939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.031953 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.133882 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.133968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.133988 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.134015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.134036 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.236657 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.236730 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.236749 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.236780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.236799 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.340615 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.340692 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.340705 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.340723 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.340735 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.443273 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.443523 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.443617 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.443687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.443744 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.447620 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.447624 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.447704 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:10 crc kubenswrapper[4699]: E1122 04:09:10.447844 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:10 crc kubenswrapper[4699]: E1122 04:09:10.448105 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:10 crc kubenswrapper[4699]: E1122 04:09:10.448321 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.549015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.549058 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.549070 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.549099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.549114 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.651977 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.652034 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.652047 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.652064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.652077 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.755313 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.755381 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.755394 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.755422 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.755460 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.859127 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.859579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.859810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.860016 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.860187 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.962753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.962810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.962824 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.962841 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:10 crc kubenswrapper[4699]: I1122 04:09:10.962854 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:10Z","lastTransitionTime":"2025-11-22T04:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.065050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.065084 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.065093 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.065109 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.065119 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.168939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.169016 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.169035 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.169067 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.169093 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.272181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.272215 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.272224 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.272238 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.272247 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.375464 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.375521 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.375539 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.375558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.375573 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.447579 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:11 crc kubenswrapper[4699]: E1122 04:09:11.447968 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.478279 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.478336 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.478352 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.478371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.478383 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.581182 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.581241 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.581251 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.581269 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.581281 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.684117 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.684277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.684300 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.684323 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.684340 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.786964 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.787036 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.787062 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.787088 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.787107 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.890099 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.890149 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.890160 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.890181 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.890194 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.993646 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.993772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.993793 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.993816 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:11 crc kubenswrapper[4699]: I1122 04:09:11.993834 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:11Z","lastTransitionTime":"2025-11-22T04:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.096761 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.096851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.096874 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.096902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.096920 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.199287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.199347 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.199357 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.199372 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.199382 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.302924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.302995 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.303013 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.303039 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.303057 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.405970 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.406014 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.406026 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.406044 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.406055 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.447907 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.448024 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.448024 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:12 crc kubenswrapper[4699]: E1122 04:09:12.448181 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:12 crc kubenswrapper[4699]: E1122 04:09:12.449086 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:12 crc kubenswrapper[4699]: E1122 04:09:12.449018 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.509497 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.509558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.509584 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.509601 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.509620 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.613773 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.613845 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.613865 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.613998 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.614020 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.718410 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.718489 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.718503 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.718524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.718537 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.822939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.823064 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.823083 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.823111 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.823133 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.926377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.926481 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.926510 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.926541 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:12 crc kubenswrapper[4699]: I1122 04:09:12.926564 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:12Z","lastTransitionTime":"2025-11-22T04:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.030346 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.030409 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.030425 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.030495 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.030515 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.134984 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.135060 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.135085 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.135118 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.135142 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.238650 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.238911 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.238929 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.238954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.238970 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.341907 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.341954 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.341966 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.341983 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.341995 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.444516 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.444573 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.444583 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.444603 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.444617 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.447945 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:13 crc kubenswrapper[4699]: E1122 04:09:13.448170 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.548474 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.548540 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.548558 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.548584 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.548604 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.651255 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.651335 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.651354 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.651386 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.651408 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.755370 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.755457 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.755476 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.755503 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.755519 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.858868 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.858906 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.858924 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.858945 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.858957 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.962136 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.962188 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.962201 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.962223 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:13 crc kubenswrapper[4699]: I1122 04:09:13.962238 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:13Z","lastTransitionTime":"2025-11-22T04:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.065069 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.065144 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.065162 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.065193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.065214 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.168394 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.168485 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.168504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.168531 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.168550 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.271817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.271857 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.271866 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.271881 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.271891 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.375405 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.375504 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.375522 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.375547 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.375565 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.447312 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.447343 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:14 crc kubenswrapper[4699]: E1122 04:09:14.447478 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.447559 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:14 crc kubenswrapper[4699]: E1122 04:09:14.447675 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:14 crc kubenswrapper[4699]: E1122 04:09:14.447844 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.478877 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.478937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.478956 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.478979 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.478998 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.581936 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.582017 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.582029 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.582050 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.582064 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.685780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.685837 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.685850 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.685870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.685880 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.789687 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.789744 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.789756 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.789775 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.789787 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.892902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.892987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.893001 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.893023 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.893036 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.996444 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.996501 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.996511 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.996529 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:14 crc kubenswrapper[4699]: I1122 04:09:14.996543 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:14Z","lastTransitionTime":"2025-11-22T04:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.099986 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.100075 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.100112 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.100146 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.100171 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.203475 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.203528 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.203542 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.203567 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.203591 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.306232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.306282 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.306297 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.306315 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.306333 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.409512 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.409570 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.409579 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.409599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.409614 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.447272 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:15 crc kubenswrapper[4699]: E1122 04:09:15.447921 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.512722 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.512794 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.512817 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.512848 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.512874 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.616719 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.616810 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.616833 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.616864 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.616884 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.720232 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.720319 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.720345 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.720371 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.720389 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.824897 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.825056 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.825085 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.825193 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.825288 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.929102 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.929176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.929200 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.929234 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:15 crc kubenswrapper[4699]: I1122 04:09:15.929254 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:15Z","lastTransitionTime":"2025-11-22T04:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.033194 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.033263 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.033278 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.033301 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.033315 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.136748 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.136825 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.136838 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.136859 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.136900 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.239865 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.239923 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.239937 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.239957 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.239974 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.343661 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.343703 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.343713 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.343731 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.343744 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447069 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447125 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447187 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447194 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447212 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447301 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: E1122 04:09:16.447355 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.447834 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:16 crc kubenswrapper[4699]: E1122 04:09:16.448248 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:16 crc kubenswrapper[4699]: E1122 04:09:16.448487 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.449933 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:09:16 crc kubenswrapper[4699]: E1122 04:09:16.450199 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.551012 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.551094 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.551115 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.551142 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.551164 4699 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.654649 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.654727 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.654744 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.654772 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.654792 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.757582 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.757642 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.757658 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.757682 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.757700 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.862331 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.862400 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.862418 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.862482 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.862502 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.965563 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.965613 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.965622 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.965639 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:16 crc kubenswrapper[4699]: I1122 04:09:16.965650 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:16Z","lastTransitionTime":"2025-11-22T04:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.068968 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.069004 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.069022 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.069045 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.069058 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.171780 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.171851 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.171870 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.171902 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.171924 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.274826 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.274898 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.274917 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.274943 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.274962 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.377893 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.377963 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.377987 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.378015 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.378034 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.447409 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:17 crc kubenswrapper[4699]: E1122 04:09:17.447584 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.481524 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.481562 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.481578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.481599 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.481614 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.585221 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.585287 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.585305 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.585332 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.585353 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.689233 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.689277 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.689288 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.689309 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.689321 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.793175 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.793222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.793235 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.793254 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.793302 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.896641 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.896685 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.896697 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.896716 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:17 crc kubenswrapper[4699]: I1122 04:09:17.896730 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:17Z","lastTransitionTime":"2025-11-22T04:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.000222 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.000324 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.000353 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.000388 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.000416 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.105245 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.105318 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.105344 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.105377 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.105401 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.207753 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.207834 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.207873 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.207910 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.207935 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.310800 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.310881 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.310919 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.310939 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.310951 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.414122 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.414176 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.414192 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.414215 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.414232 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.447061 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.447160 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.447160 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:18 crc kubenswrapper[4699]: E1122 04:09:18.447279 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:18 crc kubenswrapper[4699]: E1122 04:09:18.447632 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:18 crc kubenswrapper[4699]: E1122 04:09:18.447746 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.516941 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.516996 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.517009 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.517024 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.517034 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.593236 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.593307 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.593322 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.593348 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.593364 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.622492 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.622565 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.622578 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.622619 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.622654 4699 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T04:09:18Z","lastTransitionTime":"2025-11-22T04:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.656604 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9"] Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.657133 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.660582 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.661257 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.661567 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.661577 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.732264 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h6ndp" podStartSLOduration=88.732246049 podStartE2EDuration="1m28.732246049s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.713758402 +0000 UTC m=+110.056379609" watchObservedRunningTime="2025-11-22 04:09:18.732246049 +0000 UTC m=+110.074867236" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.751804 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podStartSLOduration=87.751777472 podStartE2EDuration="1m27.751777472s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.732507985 +0000 UTC m=+110.075129182" watchObservedRunningTime="2025-11-22 04:09:18.751777472 +0000 UTC 
m=+110.094398659" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.754896 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.754945 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.754976 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.755021 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.755036 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.773295 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.773261682 podStartE2EDuration="1m27.773261682s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.773013516 +0000 UTC m=+110.115634743" watchObservedRunningTime="2025-11-22 04:09:18.773261682 +0000 UTC m=+110.115882869" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.773732 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b7225" podStartSLOduration=87.773725693 podStartE2EDuration="1m27.773725693s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.752140881 +0000 UTC m=+110.094762068" watchObservedRunningTime="2025-11-22 04:09:18.773725693 +0000 UTC m=+110.116346880" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.787500 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.787472726 podStartE2EDuration="26.787472726s" podCreationTimestamp="2025-11-22 04:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.787124637 +0000 UTC m=+110.129745844" watchObservedRunningTime="2025-11-22 
04:09:18.787472726 +0000 UTC m=+110.130093913" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.817051 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pmtb4" podStartSLOduration=87.817025611 podStartE2EDuration="1m27.817025611s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.816961629 +0000 UTC m=+110.159582826" watchObservedRunningTime="2025-11-22 04:09:18.817025611 +0000 UTC m=+110.159646808" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.833257 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=61.833226803 podStartE2EDuration="1m1.833226803s" podCreationTimestamp="2025-11-22 04:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.832624859 +0000 UTC m=+110.175246056" watchObservedRunningTime="2025-11-22 04:09:18.833226803 +0000 UTC m=+110.175847990" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856108 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856183 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856208 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856269 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.856975 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.858045 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.866102 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.877171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fae0cb9f-0bee-4292-a8fe-aea8b9bc7017-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wqtn9\" (UID: \"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.947965 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-86ztb" podStartSLOduration=88.947923909 podStartE2EDuration="1m28.947923909s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:18.947875358 +0000 UTC m=+110.290496545" 
watchObservedRunningTime="2025-11-22 04:09:18.947923909 +0000 UTC m=+110.290545096" Nov 22 04:09:18 crc kubenswrapper[4699]: I1122 04:09:18.977351 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.025151 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gqt5x" podStartSLOduration=88.025122868 podStartE2EDuration="1m28.025122868s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:19.02437517 +0000 UTC m=+110.366996367" watchObservedRunningTime="2025-11-22 04:09:19.025122868 +0000 UTC m=+110.367744055" Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.043288 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.043243956 podStartE2EDuration="1m29.043243956s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:19.040677714 +0000 UTC m=+110.383298901" watchObservedRunningTime="2025-11-22 04:09:19.043243956 +0000 UTC m=+110.385865143" Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.076847 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.076825829 podStartE2EDuration="1m27.076825829s" podCreationTimestamp="2025-11-22 04:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:19.074933073 +0000 UTC m=+110.417554280" watchObservedRunningTime="2025-11-22 
04:09:19.076825829 +0000 UTC m=+110.419447016" Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.171300 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" event={"ID":"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017","Type":"ContainerStarted","Data":"14b4ab4962b449450210227e584353b181c14a12a92054346484c4397324e24f"} Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.171391 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" event={"ID":"fae0cb9f-0bee-4292-a8fe-aea8b9bc7017","Type":"ContainerStarted","Data":"dca5eae85361ce5fcf67a07a56328dc811006c397d1f82257004eaa643af0477"} Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.189654 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wqtn9" podStartSLOduration=88.18963257 podStartE2EDuration="1m28.18963257s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:19.187629801 +0000 UTC m=+110.530250998" watchObservedRunningTime="2025-11-22 04:09:19.18963257 +0000 UTC m=+110.532253757" Nov 22 04:09:19 crc kubenswrapper[4699]: I1122 04:09:19.447069 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:19 crc kubenswrapper[4699]: E1122 04:09:19.449074 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:20 crc kubenswrapper[4699]: I1122 04:09:20.447139 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:20 crc kubenswrapper[4699]: E1122 04:09:20.447915 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:20 crc kubenswrapper[4699]: I1122 04:09:20.448090 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:20 crc kubenswrapper[4699]: E1122 04:09:20.448217 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:20 crc kubenswrapper[4699]: I1122 04:09:20.448324 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:20 crc kubenswrapper[4699]: E1122 04:09:20.448507 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:21 crc kubenswrapper[4699]: I1122 04:09:21.447415 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:21 crc kubenswrapper[4699]: E1122 04:09:21.447629 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:22 crc kubenswrapper[4699]: I1122 04:09:22.447737 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:22 crc kubenswrapper[4699]: I1122 04:09:22.447846 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:22 crc kubenswrapper[4699]: E1122 04:09:22.447932 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:22 crc kubenswrapper[4699]: I1122 04:09:22.447738 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:22 crc kubenswrapper[4699]: E1122 04:09:22.448011 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:22 crc kubenswrapper[4699]: E1122 04:09:22.448156 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:23 crc kubenswrapper[4699]: I1122 04:09:23.447456 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:23 crc kubenswrapper[4699]: E1122 04:09:23.447675 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:24 crc kubenswrapper[4699]: I1122 04:09:24.447726 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:24 crc kubenswrapper[4699]: I1122 04:09:24.447754 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:24 crc kubenswrapper[4699]: E1122 04:09:24.447877 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:24 crc kubenswrapper[4699]: I1122 04:09:24.447726 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:24 crc kubenswrapper[4699]: E1122 04:09:24.448087 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:24 crc kubenswrapper[4699]: E1122 04:09:24.448134 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:25 crc kubenswrapper[4699]: I1122 04:09:25.448069 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:25 crc kubenswrapper[4699]: E1122 04:09:25.448309 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.221238 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/1.log" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.221960 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/0.log" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.222035 4699 generic.go:334] "Generic (PLEG): container finished" podID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" containerID="b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9" exitCode=1 Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.222080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerDied","Data":"b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9"} Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.222127 4699 scope.go:117] "RemoveContainer" containerID="f5af0f83551d8cf679ee04fbc3995afe66769f74480211fb104ebf2d6d0f9ab8" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.223064 4699 scope.go:117] "RemoveContainer" containerID="b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9" Nov 22 04:09:26 crc kubenswrapper[4699]: E1122 04:09:26.223295 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pmtb4_openshift-multus(c5f530d5-6f69-4838-a0dd-f4662ddbf85c)\"" pod="openshift-multus/multus-pmtb4" podUID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.447156 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.447242 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:26 crc kubenswrapper[4699]: E1122 04:09:26.447372 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:26 crc kubenswrapper[4699]: E1122 04:09:26.447509 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:26 crc kubenswrapper[4699]: I1122 04:09:26.447270 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:26 crc kubenswrapper[4699]: E1122 04:09:26.447662 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:27 crc kubenswrapper[4699]: I1122 04:09:27.234353 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/1.log" Nov 22 04:09:27 crc kubenswrapper[4699]: I1122 04:09:27.446900 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:27 crc kubenswrapper[4699]: E1122 04:09:27.447049 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:28 crc kubenswrapper[4699]: I1122 04:09:28.447745 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:28 crc kubenswrapper[4699]: I1122 04:09:28.447799 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:28 crc kubenswrapper[4699]: I1122 04:09:28.447862 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:28 crc kubenswrapper[4699]: E1122 04:09:28.448821 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:28 crc kubenswrapper[4699]: E1122 04:09:28.448961 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:28 crc kubenswrapper[4699]: E1122 04:09:28.449054 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:29 crc kubenswrapper[4699]: E1122 04:09:29.397083 4699 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 22 04:09:29 crc kubenswrapper[4699]: I1122 04:09:29.447400 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:29 crc kubenswrapper[4699]: E1122 04:09:29.449719 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:29 crc kubenswrapper[4699]: E1122 04:09:29.550972 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:09:30 crc kubenswrapper[4699]: I1122 04:09:30.447686 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:30 crc kubenswrapper[4699]: I1122 04:09:30.447726 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:30 crc kubenswrapper[4699]: I1122 04:09:30.447893 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:30 crc kubenswrapper[4699]: E1122 04:09:30.448055 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:30 crc kubenswrapper[4699]: E1122 04:09:30.448246 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:30 crc kubenswrapper[4699]: E1122 04:09:30.448381 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:31 crc kubenswrapper[4699]: I1122 04:09:31.447125 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:31 crc kubenswrapper[4699]: E1122 04:09:31.447302 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:31 crc kubenswrapper[4699]: I1122 04:09:31.447924 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:09:31 crc kubenswrapper[4699]: E1122 04:09:31.448115 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z7552_openshift-ovn-kubernetes(fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" Nov 22 04:09:32 crc kubenswrapper[4699]: I1122 04:09:32.447311 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:32 crc kubenswrapper[4699]: E1122 04:09:32.447507 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:32 crc kubenswrapper[4699]: I1122 04:09:32.447593 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:32 crc kubenswrapper[4699]: I1122 04:09:32.447687 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:32 crc kubenswrapper[4699]: E1122 04:09:32.447771 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:32 crc kubenswrapper[4699]: E1122 04:09:32.447828 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:33 crc kubenswrapper[4699]: I1122 04:09:33.446901 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:33 crc kubenswrapper[4699]: E1122 04:09:33.447061 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:34 crc kubenswrapper[4699]: I1122 04:09:34.446847 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:34 crc kubenswrapper[4699]: I1122 04:09:34.446859 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:34 crc kubenswrapper[4699]: I1122 04:09:34.446999 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:34 crc kubenswrapper[4699]: E1122 04:09:34.447171 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:34 crc kubenswrapper[4699]: E1122 04:09:34.447256 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:34 crc kubenswrapper[4699]: E1122 04:09:34.447400 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:34 crc kubenswrapper[4699]: E1122 04:09:34.552091 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:09:35 crc kubenswrapper[4699]: I1122 04:09:35.446897 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:35 crc kubenswrapper[4699]: E1122 04:09:35.447127 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:36 crc kubenswrapper[4699]: I1122 04:09:36.446980 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:36 crc kubenswrapper[4699]: I1122 04:09:36.447079 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:36 crc kubenswrapper[4699]: E1122 04:09:36.447564 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:36 crc kubenswrapper[4699]: I1122 04:09:36.447165 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:36 crc kubenswrapper[4699]: E1122 04:09:36.447687 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:36 crc kubenswrapper[4699]: E1122 04:09:36.447988 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:37 crc kubenswrapper[4699]: I1122 04:09:37.447838 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:37 crc kubenswrapper[4699]: E1122 04:09:37.448067 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:38 crc kubenswrapper[4699]: I1122 04:09:38.447655 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:38 crc kubenswrapper[4699]: I1122 04:09:38.447714 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:38 crc kubenswrapper[4699]: E1122 04:09:38.447871 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:38 crc kubenswrapper[4699]: E1122 04:09:38.448045 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:38 crc kubenswrapper[4699]: I1122 04:09:38.448591 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:38 crc kubenswrapper[4699]: E1122 04:09:38.448751 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:39 crc kubenswrapper[4699]: I1122 04:09:39.447568 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:39 crc kubenswrapper[4699]: E1122 04:09:39.449981 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:39 crc kubenswrapper[4699]: E1122 04:09:39.552880 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:09:40 crc kubenswrapper[4699]: I1122 04:09:40.446876 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:40 crc kubenswrapper[4699]: I1122 04:09:40.447052 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:40 crc kubenswrapper[4699]: I1122 04:09:40.447052 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:40 crc kubenswrapper[4699]: I1122 04:09:40.447312 4699 scope.go:117] "RemoveContainer" containerID="b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9" Nov 22 04:09:40 crc kubenswrapper[4699]: E1122 04:09:40.447389 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:40 crc kubenswrapper[4699]: E1122 04:09:40.447714 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:40 crc kubenswrapper[4699]: E1122 04:09:40.447890 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:41 crc kubenswrapper[4699]: I1122 04:09:41.299627 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/1.log" Nov 22 04:09:41 crc kubenswrapper[4699]: I1122 04:09:41.300026 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerStarted","Data":"ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b"} Nov 22 04:09:41 crc kubenswrapper[4699]: I1122 04:09:41.447920 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:41 crc kubenswrapper[4699]: E1122 04:09:41.448107 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:42 crc kubenswrapper[4699]: I1122 04:09:42.447712 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:42 crc kubenswrapper[4699]: I1122 04:09:42.447780 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:42 crc kubenswrapper[4699]: E1122 04:09:42.447932 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:42 crc kubenswrapper[4699]: E1122 04:09:42.448054 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:42 crc kubenswrapper[4699]: I1122 04:09:42.448512 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:42 crc kubenswrapper[4699]: E1122 04:09:42.448599 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:43 crc kubenswrapper[4699]: I1122 04:09:43.447660 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:43 crc kubenswrapper[4699]: E1122 04:09:43.447848 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:44 crc kubenswrapper[4699]: I1122 04:09:44.447061 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:44 crc kubenswrapper[4699]: I1122 04:09:44.447258 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:44 crc kubenswrapper[4699]: I1122 04:09:44.447360 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:44 crc kubenswrapper[4699]: E1122 04:09:44.447484 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:44 crc kubenswrapper[4699]: E1122 04:09:44.447706 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:44 crc kubenswrapper[4699]: E1122 04:09:44.447877 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:44 crc kubenswrapper[4699]: E1122 04:09:44.554630 4699 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:09:45 crc kubenswrapper[4699]: I1122 04:09:45.447012 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:45 crc kubenswrapper[4699]: E1122 04:09:45.447786 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:45 crc kubenswrapper[4699]: I1122 04:09:45.448265 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.322053 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/3.log" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.325489 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerStarted","Data":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.326080 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.359902 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podStartSLOduration=115.359879997 podStartE2EDuration="1m55.359879997s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:09:46.359060347 +0000 UTC m=+137.701681544" watchObservedRunningTime="2025-11-22 
04:09:46.359879997 +0000 UTC m=+137.702501194" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.447662 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.447712 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:46 crc kubenswrapper[4699]: I1122 04:09:46.447786 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:46 crc kubenswrapper[4699]: E1122 04:09:46.448556 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:46 crc kubenswrapper[4699]: E1122 04:09:46.448727 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:46 crc kubenswrapper[4699]: E1122 04:09:46.448669 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:47 crc kubenswrapper[4699]: I1122 04:09:47.214728 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj52w"] Nov 22 04:09:47 crc kubenswrapper[4699]: I1122 04:09:47.214875 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:47 crc kubenswrapper[4699]: E1122 04:09:47.214976 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:48 crc kubenswrapper[4699]: I1122 04:09:48.447078 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:48 crc kubenswrapper[4699]: I1122 04:09:48.447132 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:48 crc kubenswrapper[4699]: E1122 04:09:48.447250 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 04:09:48 crc kubenswrapper[4699]: E1122 04:09:48.447535 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 04:09:48 crc kubenswrapper[4699]: I1122 04:09:48.447664 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:48 crc kubenswrapper[4699]: E1122 04:09:48.447718 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 04:09:49 crc kubenswrapper[4699]: I1122 04:09:49.447619 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:49 crc kubenswrapper[4699]: E1122 04:09:49.448598 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pj52w" podUID="82be5d0c-6f95-43e4-aa3c-9c56de3e200c" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.446985 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.447086 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.447142 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.449758 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.450181 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.450349 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 04:09:50 crc kubenswrapper[4699]: I1122 04:09:50.450579 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 04:09:51 crc kubenswrapper[4699]: I1122 04:09:51.446913 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:09:51 crc kubenswrapper[4699]: I1122 04:09:51.452348 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 04:09:51 crc kubenswrapper[4699]: I1122 04:09:51.452407 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.339972 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:09:58 crc kubenswrapper[4699]: E1122 04:09:58.340175 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:12:00.340136324 +0000 UTC m=+271.682757551 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.340794 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.340851 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.342769 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.350392 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.442312 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.442365 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.449467 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.450281 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:58 crc 
kubenswrapper[4699]: I1122 04:09:58.564673 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.570817 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:58 crc kubenswrapper[4699]: I1122 04:09:58.580538 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 04:09:58 crc kubenswrapper[4699]: W1122 04:09:58.832737 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-fde486652c76f216a6436034f5c09f9f5139e0e1858eacd522730a6a997cb0a6 WatchSource:0}: Error finding container fde486652c76f216a6436034f5c09f9f5139e0e1858eacd522730a6a997cb0a6: Status 404 returned error can't find the container with id fde486652c76f216a6436034f5c09f9f5139e0e1858eacd522730a6a997cb0a6 Nov 22 04:09:59 crc kubenswrapper[4699]: W1122 04:09:59.039180 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-652e8f17eea690420dd9da387600dfd8a83c829211aad24adfdc288590b1d64e WatchSource:0}: Error finding container 652e8f17eea690420dd9da387600dfd8a83c829211aad24adfdc288590b1d64e: Status 404 returned error can't find the container with id 652e8f17eea690420dd9da387600dfd8a83c829211aad24adfdc288590b1d64e Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.370454 4699 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.382074 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2085efda854b115a12bbab53cfd4e00640bcc9afd0635a8deb91049c29801520"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.382127 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"652e8f17eea690420dd9da387600dfd8a83c829211aad24adfdc288590b1d64e"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.387544 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"37315a1943ef258e91929cad1aa8e3e65d81c9b9e8a13aecfc4a2d5e314cbd65"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.387673 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08fe62d601f1c169b3a91cfbe9614d1feed70ce6791d97b58bb1005c3ec835f0"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.388859 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.391839 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"86522109d02a7d143e336a0266899a54e7334ce81ab3240cb6ee48ef16aaf85f"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.391917 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fde486652c76f216a6436034f5c09f9f5139e0e1858eacd522730a6a997cb0a6"} Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.415255 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.415767 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.418241 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.418301 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.418889 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.419127 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.419489 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.423841 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.426557 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.427116 4699 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.427402 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.427412 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.428615 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jlk5m"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.429588 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.430575 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c655t"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.437284 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.437688 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.454507 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.457397 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.457653 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458576 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458645 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458704 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458720 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458825 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458886 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458953 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.458651 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459085 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459186 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459219 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459324 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459077 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mxk\" (UniqueName: \"kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459401 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459513 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: 
\"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.459840 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.460043 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.460227 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467008 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467181 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467199 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467202 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467359 4699 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.467401 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.468230 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.468817 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.468932 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.471607 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.472211 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.472238 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2gxkc"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.472865 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2gxkc" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.475879 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.476455 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.477626 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.477738 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.477837 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.477942 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.478058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.478891 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.479448 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.479810 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.480157 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.480620 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.481508 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.482164 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.486941 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.487379 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.487497 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.487739 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.487900 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.487926 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.488090 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.488243 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.488480 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.488612 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.488751 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.488921 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.489682 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.494749 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.495329 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.500367 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.504028 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.504200 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.504714 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.504944 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.505264 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.506850 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.507600 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.508340 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.509855 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.510997 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.512361 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.512701 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.512758 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.512922 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513010 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513079 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513223 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513269 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513400 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513560 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.513640 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dpsxh"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.533603 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.535825 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.536128 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.536381 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.546204 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.550040 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.566689 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 04:09:59 
crc kubenswrapper[4699]: I1122 04:09:59.566980 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.567065 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.568119 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jvnn"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.568705 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.569845 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570046 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570114 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570244 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570339 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570499 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570625 4699 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570679 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.570782 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.571686 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.572375 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.572619 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.572700 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.572970 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.575067 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.578780 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.579584 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.579989 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.580823 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.584256 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5nbqq"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.584634 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.588180 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lw7n2"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.588505 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589407 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589618 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589670 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589708 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwtk\" (UniqueName: \"kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589729 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589754 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589785 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589817 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.589834 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.590427 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.590822 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.591707 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.591459 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.591937 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592006 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.594829 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592031 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 
04:09:59.595117 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592079 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592112 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592244 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.592347 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.594256 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.597285 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.597537 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.597746 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.597787 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.601698 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fqf9j"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.602236 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-842kh"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.602787 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.603169 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.603191 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.606940 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.596584 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.620717 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.620877 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.621033 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8mxk\" (UniqueName: \"kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.621274 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.621407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.621626 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.621795 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.603619 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.603277 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.624396 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.605233 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.626889 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.605455 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.639797 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.640404 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.643451 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.643874 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.645633 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.651090 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.645857 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.651510 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.651882 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vdc7f"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.652412 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.652452 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.652671 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.652851 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.653067 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.651420 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.653306 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.653855 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.661856 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.662490 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.662906 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.662926 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.662938 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.663392 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.663601 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.663609 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.665715 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.667087 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.670018 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.670092 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c655t"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.672448 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jlk5m"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.672480 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2gxkc"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.672489 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.677302 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tm7js"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.677832 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.684198 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.684558 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.688694 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.688765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.689616 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.691590 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.693308 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.694808 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dpsxh"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.696479 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.697569 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.698489 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5nbqq"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.699833 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jvnn"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.701241 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lw7n2"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.702830 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.705787 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.706802 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.708058 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.709026 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.710852 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vdc7f"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.713177 4699 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.713556 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.714515 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.716291 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.716523 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fqf9j"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.717749 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.726513 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.728015 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.728868 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.732668 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f331d75b-ad6e-4adc-88be-379c031c7d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.732719 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22f23a9-587d-4796-a25c-6563be7a2792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.732778 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82pp8\" (UniqueName: \"kubernetes.io/projected/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-kube-api-access-82pp8\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.732853 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-metrics-certs\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733008 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733188 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ed42bd-25e2-43de-bbd7-431ab818b761-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733311 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733426 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfcdl\" (UniqueName: \"kubernetes.io/projected/40353ee4-6a92-4e39-be6f-b8249f523e36-kube-api-access-gfcdl\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733586 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.733988 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcks4\" (UniqueName: \"kubernetes.io/projected/12ba713a-d675-4920-8367-8d6c6ccff834-kube-api-access-hcks4\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734115 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734176 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734267 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734594 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734691 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c93ff986-8043-4c13-b1a1-d24305361338-signing-cabundle\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.734776 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734852 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hm7\" (UniqueName: \"kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.734908 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c93ff986-8043-4c13-b1a1-d24305361338-signing-key\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-config\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735610 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-image-import-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735660 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdcv\" (UniqueName: \"kubernetes.io/projected/854975d3-e251-4354-a2c8-84ea7da296a8-kube-api-access-wmdcv\") pod \"downloads-7954f5f757-2gxkc\" (UID: \"854975d3-e251-4354-a2c8-84ea7da296a8\") " pod="openshift-console/downloads-7954f5f757-2gxkc" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735711 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735846 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wblnl\" (UniqueName: \"kubernetes.io/projected/c93ff986-8043-4c13-b1a1-d24305361338-kube-api-access-wblnl\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735939 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.735991 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f099243-7cbe-4d7e-9e5e-062ed53026cb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736031 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmrz\" (UniqueName: \"kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736052 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2wv\" (UniqueName: \"kubernetes.io/projected/4e4eacc6-98af-4a4d-a161-a8629b46c1ef-kube-api-access-lh2wv\") pod \"migrator-59844c95c7-n4cff\" (UID: \"4e4eacc6-98af-4a4d-a161-a8629b46c1ef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736080 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736109 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-client\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736141 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-config\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736177 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736205 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736239 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bptp\" (UniqueName: \"kubernetes.io/projected/86e2f056-f89f-40db-a6e1-336632aa9afe-kube-api-access-8bptp\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736280 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736318 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-audit\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736373 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: 
\"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.736446 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7bc\" (UniqueName: \"kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-serving-cert\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737772 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737827 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98307b62-646c-4668-83a2-d5f741435197-trusted-ca\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737876 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6kd\" (UniqueName: \"kubernetes.io/projected/b4be8553-5539-4124-8106-6e65ba593dad-kube-api-access-rz6kd\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737932 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737940 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-node-pullsecrets\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.737917 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738034 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738079 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738119 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738212 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738298 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-images\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738401 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfpq\" (UniqueName: 
\"kubernetes.io/projected/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-kube-api-access-kcfpq\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738539 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738606 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-config\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738784 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-encryption-config\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.738927 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa00eae-58d2-4c7f-b232-16d1edff80f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739009 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85shx\" (UniqueName: \"kubernetes.io/projected/88c8894c-2a5e-4a5e-b3a2-83f266a23143-kube-api-access-85shx\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739089 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-config\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739008 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739161 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739248 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-webhook-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739285 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739318 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739357 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98307b62-646c-4668-83a2-d5f741435197-metrics-tls\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739389 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739421 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhbk\" (UniqueName: \"kubernetes.io/projected/bfa00eae-58d2-4c7f-b232-16d1edff80f0-kube-api-access-ddhbk\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40353ee4-6a92-4e39-be6f-b8249f523e36-service-ca-bundle\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739660 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-etcd-serving-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739705 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739740 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e597a0aa-8325-4b8b-8691-c6b2a55bc714-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739867 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739935 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.739986 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d999d\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-kube-api-access-d999d\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740201 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 
04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740241 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-etcd-client\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740319 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740349 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f099243-7cbe-4d7e-9e5e-062ed53026cb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740389 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-profile-collector-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740542 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtlf\" (UniqueName: 
\"kubernetes.io/projected/71c06d34-b31d-47d7-8323-510c5716530e-kube-api-access-hgtlf\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-serving-cert\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740715 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f099243-7cbe-4d7e-9e5e-062ed53026cb-config\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740833 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740881 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vdnv\" (UniqueName: \"kubernetes.io/projected/d9fca12a-5420-4c0f-90fc-05333ca3353f-kube-api-access-4vdnv\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.740959 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741034 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-stats-auth\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741230 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-serving-cert\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741406 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlkt\" (UniqueName: \"kubernetes.io/projected/44f5f1a9-edac-427c-b170-affcaa869772-kube-api-access-pqlkt\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741667 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zfr\" (UniqueName: \"kubernetes.io/projected/08e24326-1ea6-4ab6-bcac-c58f25a88358-kube-api-access-76zfr\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741767 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-srv-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741835 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsd6c\" (UniqueName: \"kubernetes.io/projected/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-kube-api-access-wsd6c\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741873 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741894 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-service-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.741925 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742060 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-serving-cert\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742099 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-audit-policies\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742164 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742342 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742394 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44f5f1a9-edac-427c-b170-affcaa869772-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742518 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742578 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wknc\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-kube-api-access-6wknc\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742624 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22f23a9-587d-4796-a25c-6563be7a2792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.742775 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw449\" (UniqueName: \"kubernetes.io/projected/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-kube-api-access-jw449\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743034 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-auth-proxy-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743105 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9fca12a-5420-4c0f-90fc-05333ca3353f-machine-approver-tls\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743744 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743799 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743891 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743937 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.743981 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzxbk\" (UniqueName: \"kubernetes.io/projected/85bf6018-799b-4301-9b19-f68749c028b4-kube-api-access-rzxbk\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744059 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-etcd-client\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744158 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-apiservice-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.744216 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744260 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744666 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744839 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpkqs\" (UniqueName: \"kubernetes.io/projected/a22f23a9-587d-4796-a25c-6563be7a2792-kube-api-access-bpkqs\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744883 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-audit-dir\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744912 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744934 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-config\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744960 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.744999 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq7vx\" (UniqueName: \"kubernetes.io/projected/9fa4879e-eded-43e9-816f-151c5f0263cc-kube-api-access-hq7vx\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 
04:09:59.745085 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsx6\" (UniqueName: \"kubernetes.io/projected/07ed42bd-25e2-43de-bbd7-431ab818b761-kube-api-access-fvsx6\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745118 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f5f1a9-edac-427c-b170-affcaa869772-serving-cert\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745147 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dq4\" (UniqueName: \"kubernetes.io/projected/e597a0aa-8325-4b8b-8691-c6b2a55bc714-kube-api-access-l9dq4\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745180 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-encryption-config\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745208 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c8894c-2a5e-4a5e-b3a2-83f266a23143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745248 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/86e2f056-f89f-40db-a6e1-336632aa9afe-tmpfs\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745274 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-default-certificate\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745452 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f331d75b-ad6e-4adc-88be-379c031c7d22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745480 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b4be8553-5539-4124-8106-6e65ba593dad-audit-dir\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745599 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745657 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745712 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwtk\" (UniqueName: \"kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745754 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e597a0aa-8325-4b8b-8691-c6b2a55bc714-proxy-tls\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.745781 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.746045 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.746954 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.747445 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.748188 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.748922 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.749074 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gmb69"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.750774 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.753958 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.755072 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4s69k"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.755280 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.755858 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.755959 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4s69k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.757117 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4s69k"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.758338 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gmb69"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.759390 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-x4855"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.760518 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.760609 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x4855"] Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.765452 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.785634 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.804993 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.824980 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.846943 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e597a0aa-8325-4b8b-8691-c6b2a55bc714-proxy-tls\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.846983 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f331d75b-ad6e-4adc-88be-379c031c7d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847003 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22f23a9-587d-4796-a25c-6563be7a2792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847028 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82pp8\" (UniqueName: \"kubernetes.io/projected/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-kube-api-access-82pp8\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847049 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-metrics-certs\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847092 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfcdl\" (UniqueName: \"kubernetes.io/projected/40353ee4-6a92-4e39-be6f-b8249f523e36-kube-api-access-gfcdl\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847135 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ed42bd-25e2-43de-bbd7-431ab818b761-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847157 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcks4\" (UniqueName: \"kubernetes.io/projected/12ba713a-d675-4920-8367-8d6c6ccff834-kube-api-access-hcks4\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847177 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847217 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847237 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c93ff986-8043-4c13-b1a1-d24305361338-signing-cabundle\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847257 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847278 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94hm7\" (UniqueName: \"kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c93ff986-8043-4c13-b1a1-d24305361338-signing-key\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847313 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847332 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-image-import-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847349 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wmdcv\" (UniqueName: \"kubernetes.io/projected/854975d3-e251-4354-a2c8-84ea7da296a8-kube-api-access-wmdcv\") pod \"downloads-7954f5f757-2gxkc\" (UID: \"854975d3-e251-4354-a2c8-84ea7da296a8\") " pod="openshift-console/downloads-7954f5f757-2gxkc" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847371 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-config\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847387 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847407 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wblnl\" (UniqueName: \"kubernetes.io/projected/c93ff986-8043-4c13-b1a1-d24305361338-kube-api-access-wblnl\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847425 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.847460 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f099243-7cbe-4d7e-9e5e-062ed53026cb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847479 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmrz\" (UniqueName: \"kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847495 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2wv\" (UniqueName: \"kubernetes.io/projected/4e4eacc6-98af-4a4d-a161-a8629b46c1ef-kube-api-access-lh2wv\") pod \"migrator-59844c95c7-n4cff\" (UID: \"4e4eacc6-98af-4a4d-a161-a8629b46c1ef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847512 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847552 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-client\") pod \"etcd-operator-b45778765-fqf9j\" (UID: 
\"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847573 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-config\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847596 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bptp\" (UniqueName: \"kubernetes.io/projected/86e2f056-f89f-40db-a6e1-336632aa9afe-kube-api-access-8bptp\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847631 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-audit\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847651 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847668 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847689 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7bc\" (UniqueName: \"kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847705 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-serving-cert\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847724 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98307b62-646c-4668-83a2-d5f741435197-trusted-ca\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847742 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6kd\" (UniqueName: \"kubernetes.io/projected/b4be8553-5539-4124-8106-6e65ba593dad-kube-api-access-rz6kd\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847765 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-node-pullsecrets\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847802 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847820 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.847852 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-images\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847873 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfpq\" (UniqueName: \"kubernetes.io/projected/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-kube-api-access-kcfpq\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847893 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847911 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-config\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847926 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-encryption-config\") pod \"apiserver-76f77b778f-c655t\" (UID: 
\"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847945 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa00eae-58d2-4c7f-b232-16d1edff80f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847965 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85shx\" (UniqueName: \"kubernetes.io/projected/88c8894c-2a5e-4a5e-b3a2-83f266a23143-kube-api-access-85shx\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.847983 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-config\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848001 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848017 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848035 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-webhook-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848059 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848077 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40353ee4-6a92-4e39-be6f-b8249f523e36-service-ca-bundle\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848092 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98307b62-646c-4668-83a2-d5f741435197-metrics-tls\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848110 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848126 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddhbk\" (UniqueName: \"kubernetes.io/projected/bfa00eae-58d2-4c7f-b232-16d1edff80f0-kube-api-access-ddhbk\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848143 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848162 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e597a0aa-8325-4b8b-8691-c6b2a55bc714-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848181 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-etcd-serving-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.848203 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d999d\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-kube-api-access-d999d\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848253 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-etcd-client\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848273 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f099243-7cbe-4d7e-9e5e-062ed53026cb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848292 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848309 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-serving-cert\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848328 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-profile-collector-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848349 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtlf\" (UniqueName: \"kubernetes.io/projected/71c06d34-b31d-47d7-8323-510c5716530e-kube-api-access-hgtlf\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848373 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f099243-7cbe-4d7e-9e5e-062ed53026cb-config\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848397 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848415 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vdnv\" (UniqueName: \"kubernetes.io/projected/d9fca12a-5420-4c0f-90fc-05333ca3353f-kube-api-access-4vdnv\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848444 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848465 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-stats-auth\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848480 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 
04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848498 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-serving-cert\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848515 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlkt\" (UniqueName: \"kubernetes.io/projected/44f5f1a9-edac-427c-b170-affcaa869772-kube-api-access-pqlkt\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848548 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76zfr\" (UniqueName: \"kubernetes.io/projected/08e24326-1ea6-4ab6-bcac-c58f25a88358-kube-api-access-76zfr\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848590 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-srv-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848634 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsd6c\" (UniqueName: \"kubernetes.io/projected/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-kube-api-access-wsd6c\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: 
\"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848660 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-service-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848687 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848713 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848736 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-serving-cert\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848755 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-audit-policies\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848776 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848795 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44f5f1a9-edac-427c-b170-affcaa869772-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848813 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22f23a9-587d-4796-a25c-6563be7a2792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848831 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wknc\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-kube-api-access-6wknc\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.848855 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f331d75b-ad6e-4adc-88be-379c031c7d22-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848882 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw449\" (UniqueName: \"kubernetes.io/projected/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-kube-api-access-jw449\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848908 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848948 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-auth-proxy-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.848991 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9fca12a-5420-4c0f-90fc-05333ca3353f-machine-approver-tls\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849028 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849049 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849066 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzxbk\" (UniqueName: \"kubernetes.io/projected/85bf6018-799b-4301-9b19-f68749c028b4-kube-api-access-rzxbk\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 
04:09:59.849111 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-etcd-client\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849140 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-apiservice-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849160 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849351 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpkqs\" (UniqueName: \"kubernetes.io/projected/a22f23a9-587d-4796-a25c-6563be7a2792-kube-api-access-bpkqs\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849374 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-config\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849393 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849448 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq7vx\" (UniqueName: \"kubernetes.io/projected/9fa4879e-eded-43e9-816f-151c5f0263cc-kube-api-access-hq7vx\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849472 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-audit-dir\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849491 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config\") pod 
\"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849509 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-encryption-config\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849526 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsx6\" (UniqueName: \"kubernetes.io/projected/07ed42bd-25e2-43de-bbd7-431ab818b761-kube-api-access-fvsx6\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849543 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f5f1a9-edac-427c-b170-affcaa869772-serving-cert\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849560 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9dq4\" (UniqueName: \"kubernetes.io/projected/e597a0aa-8325-4b8b-8691-c6b2a55bc714-kube-api-access-l9dq4\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849579 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c8894c-2a5e-4a5e-b3a2-83f266a23143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849595 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-default-certificate\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849613 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/86e2f056-f89f-40db-a6e1-336632aa9afe-tmpfs\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849632 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b4be8553-5539-4124-8106-6e65ba593dad-audit-dir\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849652 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f331d75b-ad6e-4adc-88be-379c031c7d22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.849672 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22f23a9-587d-4796-a25c-6563be7a2792-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.850146 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.850404 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-config\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.851039 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-audit\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc 
kubenswrapper[4699]: I1122 04:09:59.851164 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.851288 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-image-import-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.851631 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-node-pullsecrets\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.852125 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-config\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.852392 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 
crc kubenswrapper[4699]: I1122 04:09:59.852401 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98307b62-646c-4668-83a2-d5f741435197-trusted-ca\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.853460 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.853874 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.853956 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.854410 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 
crc kubenswrapper[4699]: I1122 04:09:59.854211 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.854810 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85bf6018-799b-4301-9b19-f68749c028b4-audit-dir\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.854848 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-config\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.855976 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-config\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.856057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b4be8553-5539-4124-8106-6e65ba593dad-audit-dir\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.856096 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98307b62-646c-4668-83a2-d5f741435197-metrics-tls\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.856498 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/86e2f056-f89f-40db-a6e1-336632aa9afe-tmpfs\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.849112 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.857914 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07ed42bd-25e2-43de-bbd7-431ab818b761-images\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.855986 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-service-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.858119 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e597a0aa-8325-4b8b-8691-c6b2a55bc714-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.859179 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d9fca12a-5420-4c0f-90fc-05333ca3353f-auth-proxy-config\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.859814 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.859792 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/07ed42bd-25e2-43de-bbd7-431ab818b761-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.860611 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 
04:09:59.860633 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.860702 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.860917 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/88c8894c-2a5e-4a5e-b3a2-83f266a23143-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.861301 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.862035 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44f5f1a9-edac-427c-b170-affcaa869772-available-featuregates\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.862036 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.862364 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/85bf6018-799b-4301-9b19-f68749c028b4-etcd-serving-ca\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.862652 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.863896 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.865703 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.866510 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f331d75b-ad6e-4adc-88be-379c031c7d22-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.868265 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-serving-cert\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.869258 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-etcd-client\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.869578 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f099243-7cbe-4d7e-9e5e-062ed53026cb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.869716 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f099243-7cbe-4d7e-9e5e-062ed53026cb-config\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.869960 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-serving-cert\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.871097 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/12ba713a-d675-4920-8367-8d6c6ccff834-srv-cert\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.871283 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-profile-collector-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.871619 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44f5f1a9-edac-427c-b170-affcaa869772-serving-cert\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.877786 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bfa00eae-58d2-4c7f-b232-16d1edff80f0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.879156 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d9fca12a-5420-4c0f-90fc-05333ca3353f-machine-approver-tls\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.879297 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/85bf6018-799b-4301-9b19-f68749c028b4-encryption-config\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.879865 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.880105 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-webhook-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.880302 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86e2f056-f89f-40db-a6e1-336632aa9afe-apiservice-cert\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.881577 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22f23a9-587d-4796-a25c-6563be7a2792-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.882232 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.885798 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.893336 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c93ff986-8043-4c13-b1a1-d24305361338-signing-cabundle\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.905664 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.927960 4699 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.945189 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.957133 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c93ff986-8043-4c13-b1a1-d24305361338-signing-key\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.965590 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 04:09:59 crc kubenswrapper[4699]: I1122 04:09:59.984904 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.005446 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.024730 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.046038 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.085706 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.095079 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-serving-cert\") pod \"etcd-operator-b45778765-fqf9j\" 
(UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.104767 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.125063 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.136443 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-client\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.145109 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.154684 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-config\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.165493 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.186166 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.196074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.205493 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.225091 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.233154 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fa4879e-eded-43e9-816f-151c5f0263cc-etcd-service-ca\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.265106 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8mxk\" (UniqueName: \"kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk\") pod \"route-controller-manager-6576b87f9c-9rpts\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.267877 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.285407 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.291694 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-stats-auth\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.305131 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.333342 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.336840 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-metrics-certs\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.345628 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.357888 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.358766 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/40353ee4-6a92-4e39-be6f-b8249f523e36-default-certificate\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.365754 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.373795 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40353ee4-6a92-4e39-be6f-b8249f523e36-service-ca-bundle\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.391184 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.405674 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.425447 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.449169 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.452251 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-audit-policies\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.465006 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.474286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.485250 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.491938 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4be8553-5539-4124-8106-6e65ba593dad-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.505896 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.526545 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.537918 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-etcd-client\") pod 
\"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.545722 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.554426 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.561318 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-serving-cert\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.565474 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.575326 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b4be8553-5539-4124-8106-6e65ba593dad-encryption-config\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.587726 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.603059 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e597a0aa-8325-4b8b-8691-c6b2a55bc714-proxy-tls\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.605105 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.625661 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.645912 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.663930 4699 request.go:700] Waited for 1.012807836s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/secrets?fieldSelector=metadata.name%3Dopenshift-controller-manager-operator-serving-cert&limit=500&resourceVersion=0 Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.665196 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.679714 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.685730 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 04:10:00 crc 
kubenswrapper[4699]: I1122 04:10:00.704889 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.714794 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.725236 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.745722 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.765055 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.785651 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.805003 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.825613 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.845772 4699 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.854021 4699 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.854094 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config podName:ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0 nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.354077295 +0000 UTC m=+152.696698482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config") pod "service-ca-operator-777779d784-h4sqq" (UID: "ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.855180 4699 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.855351 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert podName:ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0 nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.355312415 +0000 UTC m=+152.697933782 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert") pod "service-ca-operator-777779d784-h4sqq" (UID: "ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0") : failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.856300 4699 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.856593 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs podName:08e24326-1ea6-4ab6-bcac-c58f25a88358 nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.356565546 +0000 UTC m=+152.699186913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs") pod "multus-admission-controller-857f4d67dd-vdc7f" (UID: "08e24326-1ea6-4ab6-bcac-c58f25a88358") : failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.859483 4699 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.859521 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert podName:71c06d34-b31d-47d7-8323-510c5716530e nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.359512508 +0000 UTC m=+152.702133695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert") pod "catalog-operator-68c6474976-5vkz8" (UID: "71c06d34-b31d-47d7-8323-510c5716530e") : failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.860953 4699 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.860990 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls podName:3b82b9e9-6fee-42bd-8cdd-3bacf580f98e nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.360982005 +0000 UTC m=+152.703603192 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls") pod "machine-config-operator-74547568cd-tnb4t" (UID: "3b82b9e9-6fee-42bd-8cdd-3bacf580f98e") : failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.861250 4699 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.861470 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca podName:3fca2cfb-d582-4dbb-ab4c-199316fce981 nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.361426636 +0000 UTC m=+152.704047853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca") pod "marketplace-operator-79b997595-4dvpp" (UID: "3fca2cfb-d582-4dbb-ab4c-199316fce981") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.861532 4699 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.861806 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images podName:3b82b9e9-6fee-42bd-8cdd-3bacf580f98e nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.361789094 +0000 UTC m=+152.704410321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images") pod "machine-config-operator-74547568cd-tnb4t" (UID: "3b82b9e9-6fee-42bd-8cdd-3bacf580f98e") : failed to sync configmap cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.863714 4699 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: E1122 04:10:00.863836 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics podName:3fca2cfb-d582-4dbb-ab4c-199316fce981 nodeName:}" failed. No retries permitted until 2025-11-22 04:10:01.363814664 +0000 UTC m=+152.706435881 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics") pod "marketplace-operator-79b997595-4dvpp" (UID: "3fca2cfb-d582-4dbb-ab4c-199316fce981") : failed to sync secret cache: timed out waiting for the condition Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.865624 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.885655 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.905834 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.926528 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.946191 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.966142 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 04:10:00 crc kubenswrapper[4699]: I1122 04:10:00.986253 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.005044 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.025596 4699 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.046321 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.066590 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.085533 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.105232 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.125333 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.144915 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.165351 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.186424 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.205258 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.233643 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 04:10:01 crc 
kubenswrapper[4699]: I1122 04:10:01.246140 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.265886 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.286001 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.307028 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.324846 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384395 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384475 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384569 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384697 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384731 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384784 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.384820 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:10:01 crc 
kubenswrapper[4699]: I1122 04:10:01.384857 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.385768 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-config\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.387083 4699 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.387796 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-images\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.390386 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.390511 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.395014 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-serving-cert\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.406158 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71c06d34-b31d-47d7-8323-510c5716530e-srv-cert\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.406479 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-proxy-tls\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.406714 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwtk\" (UniqueName: \"kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk\") pod \"oauth-openshift-558db77b4-qc8mt\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 
04:10:01.407304 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/08e24326-1ea6-4ab6-bcac-c58f25a88358-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.407391 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.410796 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" event={"ID":"a2f1bf82-73fb-4dcc-82e1-7d521ad29241","Type":"ContainerStarted","Data":"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388"} Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.410847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" event={"ID":"a2f1bf82-73fb-4dcc-82e1-7d521ad29241","Type":"ContainerStarted","Data":"364f481d4dac3795b089009abcb0db3aafb910e97c2580bb39aacdba90c9ea8b"} Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.411678 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.425165 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.446605 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.458365 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.464973 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.485452 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.505956 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.528901 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.545178 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.566205 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.610033 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82pp8\" (UniqueName: \"kubernetes.io/projected/8cba166d-e47a-4e78-ab6b-f5c986cc18f4-kube-api-access-82pp8\") pod \"openshift-controller-manager-operator-756b6f6bc6-pbzdl\" (UID: \"8cba166d-e47a-4e78-ab6b-f5c986cc18f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.631341 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bptp\" (UniqueName: \"kubernetes.io/projected/86e2f056-f89f-40db-a6e1-336632aa9afe-kube-api-access-8bptp\") pod \"packageserver-d55dfcdfc-b66nb\" (UID: \"86e2f056-f89f-40db-a6e1-336632aa9afe\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.644122 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.660350 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2901096e-6b5f-4a68-a8e4-fa91ff0575fd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kqmwr\" (UID: \"2901096e-6b5f-4a68-a8e4-fa91ff0575fd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.681362 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7bc\" (UniqueName: \"kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc\") pod \"marketplace-operator-79b997595-4dvpp\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.681490 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"]
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.683972 4699 request.go:700] Waited for 1.832530503s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/default/token
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.692659 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.713145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmdcv\" (UniqueName: \"kubernetes.io/projected/854975d3-e251-4354-a2c8-84ea7da296a8-kube-api-access-wmdcv\") pod \"downloads-7954f5f757-2gxkc\" (UID: \"854975d3-e251-4354-a2c8-84ea7da296a8\") " pod="openshift-console/downloads-7954f5f757-2gxkc"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.723893 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6kd\" (UniqueName: \"kubernetes.io/projected/b4be8553-5539-4124-8106-6e65ba593dad-kube-api-access-rz6kd\") pod \"apiserver-7bbb656c7d-8v9k5\" (UID: \"b4be8553-5539-4124-8106-6e65ba593dad\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.742632 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfcdl\" (UniqueName: \"kubernetes.io/projected/40353ee4-6a92-4e39-be6f-b8249f523e36-kube-api-access-gfcdl\") pod \"router-default-5444994796-842kh\" (UID: \"40353ee4-6a92-4e39-be6f-b8249f523e36\") " pod="openshift-ingress/router-default-5444994796-842kh"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.759449 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2gxkc"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.759803 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcks4\" (UniqueName: \"kubernetes.io/projected/12ba713a-d675-4920-8367-8d6c6ccff834-kube-api-access-hcks4\") pod \"olm-operator-6b444d44fb-jtp76\" (UID: \"12ba713a-d675-4920-8367-8d6c6ccff834\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.780137 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.783988 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hm7\" (UniqueName: \"kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7\") pod \"console-f9d7485db-9sbmb\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " pod="openshift-console/console-f9d7485db-9sbmb"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.804658 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wblnl\" (UniqueName: \"kubernetes.io/projected/c93ff986-8043-4c13-b1a1-d24305361338-kube-api-access-wblnl\") pod \"service-ca-9c57cc56f-5nbqq\" (UID: \"c93ff986-8043-4c13-b1a1-d24305361338\") " pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.804895 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.817128 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.823342 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2wv\" (UniqueName: \"kubernetes.io/projected/4e4eacc6-98af-4a4d-a161-a8629b46c1ef-kube-api-access-lh2wv\") pod \"migrator-59844c95c7-n4cff\" (UID: \"4e4eacc6-98af-4a4d-a161-a8629b46c1ef\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.826709 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9sbmb"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.840999 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85shx\" (UniqueName: \"kubernetes.io/projected/88c8894c-2a5e-4a5e-b3a2-83f266a23143-kube-api-access-85shx\") pod \"cluster-samples-operator-665b6dd947-shhjp\" (UID: \"88c8894c-2a5e-4a5e-b3a2-83f266a23143\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.863707 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddhbk\" (UniqueName: \"kubernetes.io/projected/bfa00eae-58d2-4c7f-b232-16d1edff80f0-kube-api-access-ddhbk\") pod \"package-server-manager-789f6589d5-dpfg2\" (UID: \"bfa00eae-58d2-4c7f-b232-16d1edff80f0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.890913 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.896942 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfpq\" (UniqueName: \"kubernetes.io/projected/0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d-kube-api-access-kcfpq\") pod \"authentication-operator-69f744f599-dpsxh\" (UID: \"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.902659 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.906078 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.911743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wknc\" (UniqueName: \"kubernetes.io/projected/98307b62-646c-4668-83a2-d5f741435197-kube-api-access-6wknc\") pod \"ingress-operator-5b745b69d9-7t7r7\" (UID: \"98307b62-646c-4668-83a2-d5f741435197\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.912099 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.920611 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.924209 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpkqs\" (UniqueName: \"kubernetes.io/projected/a22f23a9-587d-4796-a25c-6563be7a2792-kube-api-access-bpkqs\") pod \"openshift-apiserver-operator-796bbdcf4f-dvbjp\" (UID: \"a22f23a9-587d-4796-a25c-6563be7a2792\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.930898 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl"]
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.931083 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.948047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw449\" (UniqueName: \"kubernetes.io/projected/3b82b9e9-6fee-42bd-8cdd-3bacf580f98e-kube-api-access-jw449\") pod \"machine-config-operator-74547568cd-tnb4t\" (UID: \"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.976976 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-842kh"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.978288 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsx6\" (UniqueName: \"kubernetes.io/projected/07ed42bd-25e2-43de-bbd7-431ab818b761-kube-api-access-fvsx6\") pod \"machine-api-operator-5694c8668f-jlk5m\" (UID: \"07ed42bd-25e2-43de-bbd7-431ab818b761\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.978606 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.980036 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m"
Nov 22 04:10:01 crc kubenswrapper[4699]: I1122 04:10:01.992984 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq7vx\" (UniqueName: \"kubernetes.io/projected/9fa4879e-eded-43e9-816f-151c5f0263cc-kube-api-access-hq7vx\") pod \"etcd-operator-b45778765-fqf9j\" (UID: \"9fa4879e-eded-43e9-816f-151c5f0263cc\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.006866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dq4\" (UniqueName: \"kubernetes.io/projected/e597a0aa-8325-4b8b-8691-c6b2a55bc714-kube-api-access-l9dq4\") pod \"machine-config-controller-84d6567774-nfp7k\" (UID: \"e597a0aa-8325-4b8b-8691-c6b2a55bc714\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.030814 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.031840 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0f099243-7cbe-4d7e-9e5e-062ed53026cb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-v8rsv\" (UID: \"0f099243-7cbe-4d7e-9e5e-062ed53026cb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.042332 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.042907 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzxbk\" (UniqueName: \"kubernetes.io/projected/85bf6018-799b-4301-9b19-f68749c028b4-kube-api-access-rzxbk\") pod \"apiserver-76f77b778f-c655t\" (UID: \"85bf6018-799b-4301-9b19-f68749c028b4\") " pod="openshift-apiserver/apiserver-76f77b778f-c655t"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.057997 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2gxkc"]
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.062968 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-7l8mf\" (UID: \"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.082814 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.087548 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"]
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.103221 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d999d\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-kube-api-access-d999d\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.103629 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsd6c\" (UniqueName: \"kubernetes.io/projected/ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0-kube-api-access-wsd6c\") pod \"service-ca-operator-777779d784-h4sqq\" (UID: \"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.140750 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.142499 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlkt\" (UniqueName: \"kubernetes.io/projected/44f5f1a9-edac-427c-b170-affcaa869772-kube-api-access-pqlkt\") pod \"openshift-config-operator-7777fb866f-mdzbj\" (UID: \"44f5f1a9-edac-427c-b170-affcaa869772\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.147647 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zfr\" (UniqueName: \"kubernetes.io/projected/08e24326-1ea6-4ab6-bcac-c58f25a88358-kube-api-access-76zfr\") pod \"multus-admission-controller-857f4d67dd-vdc7f\" (UID: \"08e24326-1ea6-4ab6-bcac-c58f25a88358\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.159962 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtlf\" (UniqueName: \"kubernetes.io/projected/71c06d34-b31d-47d7-8323-510c5716530e-kube-api-access-hgtlf\") pod \"catalog-operator-68c6474976-5vkz8\" (UID: \"71c06d34-b31d-47d7-8323-510c5716530e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.169200 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.180664 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f331d75b-ad6e-4adc-88be-379c031c7d22-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xwbdh\" (UID: \"f331d75b-ad6e-4adc-88be-379c031c7d22\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.184373 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.204415 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vdnv\" (UniqueName: \"kubernetes.io/projected/d9fca12a-5420-4c0f-90fc-05333ca3353f-kube-api-access-4vdnv\") pod \"machine-approver-56656f9798-nl7ht\" (UID: \"d9fca12a-5420-4c0f-90fc-05333ca3353f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.221738 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmrz\" (UniqueName: \"kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz\") pod \"controller-manager-879f6c89f-9pj89\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.229146 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.256346 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.287264 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.298462 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-c655t"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303356 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303442 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7343df3b-7616-42dd-8e27-5f9a2031a8d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303472 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-trusted-ca\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303535 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ed22c0-5989-4ddf-b4fb-290cad527cbc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303612 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pmt\" (UniqueName: \"kubernetes.io/projected/9b5be81b-fbbe-4991-b218-53b76f364917-kube-api-access-29pmt\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsgtk\" (UniqueName: \"kubernetes.io/projected/11ed22c0-5989-4ddf-b4fb-290cad527cbc-kube-api-access-fsgtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303707 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dcf8242-11fe-41bf-babf-825d87eabd70-metrics-tls\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303773 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-config\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303876 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxrj\" (UniqueName: \"kubernetes.io/projected/7dcf8242-11fe-41bf-babf-825d87eabd70-kube-api-access-5nxrj\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.303988 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv2n\" (UniqueName: \"kubernetes.io/projected/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-kube-api-access-ndv2n\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304018 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-node-bootstrap-token\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304099 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d4jd\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304185 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304227 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-certs\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304463 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304496 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304536 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glx9s\" (UniqueName: \"kubernetes.io/projected/7343df3b-7616-42dd-8e27-5f9a2031a8d9-kube-api-access-glx9s\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304558 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-serving-cert\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304612 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w2mf\" (UniqueName: \"kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304673 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ed22c0-5989-4ddf-b4fb-290cad527cbc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304699 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.304719 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.306762 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:02.806746937 +0000 UTC m=+154.149368124 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.319259 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.344492 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.362803 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.408234 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.412100 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.412344 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:02.912315263 +0000 UTC m=+154.254936440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.413475 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414042 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414121 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glx9s\" (UniqueName: \"kubernetes.io/projected/7343df3b-7616-42dd-8e27-5f9a2031a8d9-kube-api-access-glx9s\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414197 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-serving-cert\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414300 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414334 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w2mf\" (UniqueName: \"kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414396 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ed22c0-5989-4ddf-b4fb-290cad527cbc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414472 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414515 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stczf\" (UniqueName: \"kubernetes.io/projected/6128e348-e15b-4e9b-adc7-851ae384ec4d-kube-api-access-stczf\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414652 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-cert\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414719 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lhnd\" (UniqueName: \"kubernetes.io/projected/65188bec-6189-4789-9b29-f8241a81302e-kube-api-access-7lhnd\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69"
Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.414825 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-socket-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69"
Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.415714 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:02.915518151 +0000 UTC m=+154.258139338 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.415824 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.426062 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7343df3b-7616-42dd-8e27-5f9a2031a8d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.426114 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-trusted-ca\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.427209 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.428143 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.431260 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-trusted-ca\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.432417 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-serving-cert\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.432709 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ed22c0-5989-4ddf-b4fb-290cad527cbc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.432883 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pmt\" (UniqueName: \"kubernetes.io/projected/9b5be81b-fbbe-4991-b218-53b76f364917-kube-api-access-29pmt\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433276 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ed22c0-5989-4ddf-b4fb-290cad527cbc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433720 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsgtk\" (UniqueName: \"kubernetes.io/projected/11ed22c0-5989-4ddf-b4fb-290cad527cbc-kube-api-access-fsgtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433796 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dcf8242-11fe-41bf-babf-825d87eabd70-metrics-tls\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433860 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-plugins-dir\") pod 
\"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433875 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ed22c0-5989-4ddf-b4fb-290cad527cbc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-config\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxrj\" (UniqueName: \"kubernetes.io/projected/7dcf8242-11fe-41bf-babf-825d87eabd70-kube-api-access-5nxrj\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.433997 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndv2n\" (UniqueName: \"kubernetes.io/projected/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-kube-api-access-ndv2n\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434022 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-node-bootstrap-token\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434079 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-mountpoint-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434117 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d4jd\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434149 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434210 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 
04:10:02.434249 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmgkq\" (UniqueName: \"kubernetes.io/projected/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-kube-api-access-nmgkq\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434279 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-csi-data-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434302 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-certs\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434323 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-registration-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.434355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6128e348-e15b-4e9b-adc7-851ae384ec4d-metrics-tls\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc 
kubenswrapper[4699]: I1122 04:10:02.434509 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6128e348-e15b-4e9b-adc7-851ae384ec4d-config-volume\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.436055 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-config\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.436226 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.436498 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.437190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc 
kubenswrapper[4699]: I1122 04:10:02.438690 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.439427 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.440933 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.442444 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.442589 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" event={"ID":"3fca2cfb-d582-4dbb-ab4c-199316fce981","Type":"ContainerStarted","Data":"dc2b778e72f750da7f59b72e0d942d00238358a3d13c57d081a7de189c6811da"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.442725 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7343df3b-7616-42dd-8e27-5f9a2031a8d9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.444346 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-842kh" event={"ID":"40353ee4-6a92-4e39-be6f-b8249f523e36","Type":"ContainerStarted","Data":"4016de1ea8c5c665e9ffd13c9f742052b1f18fb7409b9f00cea2adbbcd689d38"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.447203 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" event={"ID":"8cba166d-e47a-4e78-ab6b-f5c986cc18f4","Type":"ContainerStarted","Data":"80859dcba69a9c4bd5e79f8bdb94f2ec13972450266be9b498cb271c3bb641f9"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.447232 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" event={"ID":"8cba166d-e47a-4e78-ab6b-f5c986cc18f4","Type":"ContainerStarted","Data":"7867e7ee73a9c39ff0458b9b6f43e208572d97c96b21de69ee2972f5e12846ed"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.447507 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-node-bootstrap-token\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.448069 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/9b5be81b-fbbe-4991-b218-53b76f364917-certs\") pod 
\"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.448396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.448907 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" event={"ID":"b3173c9e-b8ff-4407-bb12-660219ce7a55","Type":"ContainerStarted","Data":"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.448964 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" event={"ID":"b3173c9e-b8ff-4407-bb12-660219ce7a55","Type":"ContainerStarted","Data":"d9cb1e62be0b7f3a55fdf4d3c1dea04bf1262a6e610949ba35b8e78e86838527"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.451039 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dcf8242-11fe-41bf-babf-825d87eabd70-metrics-tls\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.451110 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.453594 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2gxkc" 
event={"ID":"854975d3-e251-4354-a2c8-84ea7da296a8","Type":"ContainerStarted","Data":"d8d44b8acfd0a76aa6ffd98baa69e6e23459b355d34a88bd8077913d8327fc53"} Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.461691 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glx9s\" (UniqueName: \"kubernetes.io/projected/7343df3b-7616-42dd-8e27-5f9a2031a8d9-kube-api-access-glx9s\") pod \"control-plane-machine-set-operator-78cbb6b69f-w5jfl\" (UID: \"7343df3b-7616-42dd-8e27-5f9a2031a8d9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.476577 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.480735 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.491865 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w2mf\" (UniqueName: \"kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf\") pod \"collect-profiles-29396400-cqt2c\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.524112 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pmt\" (UniqueName: \"kubernetes.io/projected/9b5be81b-fbbe-4991-b218-53b76f364917-kube-api-access-29pmt\") pod \"machine-config-server-tm7js\" (UID: \"9b5be81b-fbbe-4991-b218-53b76f364917\") " pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.536650 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.536934 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stczf\" (UniqueName: \"kubernetes.io/projected/6128e348-e15b-4e9b-adc7-851ae384ec4d-kube-api-access-stczf\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.536984 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-cert\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537006 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-socket-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537025 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lhnd\" (UniqueName: \"kubernetes.io/projected/65188bec-6189-4789-9b29-f8241a81302e-kube-api-access-7lhnd\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537125 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-plugins-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537218 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-mountpoint-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537258 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-csi-data-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537280 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmgkq\" (UniqueName: \"kubernetes.io/projected/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-kube-api-access-nmgkq\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537302 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-registration-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537329 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6128e348-e15b-4e9b-adc7-851ae384ec4d-metrics-tls\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.537358 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6128e348-e15b-4e9b-adc7-851ae384ec4d-config-volume\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.538124 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.038098636 +0000 UTC m=+154.380719973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.538817 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-plugins-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.541715 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-mountpoint-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.541779 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-registration-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.541953 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-csi-data-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.543232 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6128e348-e15b-4e9b-adc7-851ae384ec4d-config-volume\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.543403 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65188bec-6189-4789-9b29-f8241a81302e-socket-dir\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.543748 4699 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qc8mt container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": 
dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.543785 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.549972 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-cert\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.550532 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsgtk\" (UniqueName: \"kubernetes.io/projected/11ed22c0-5989-4ddf-b4fb-290cad527cbc-kube-api-access-fsgtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-fxd9l\" (UID: \"11ed22c0-5989-4ddf-b4fb-290cad527cbc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.553698 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6128e348-e15b-4e9b-adc7-851ae384ec4d-metrics-tls\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.558626 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.573860 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.586911 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d4jd\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.598388 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.611586 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.614078 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.625340 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxrj\" (UniqueName: \"kubernetes.io/projected/7dcf8242-11fe-41bf-babf-825d87eabd70-kube-api-access-5nxrj\") pod \"dns-operator-744455d44c-lw7n2\" (UID: \"7dcf8242-11fe-41bf-babf-825d87eabd70\") " pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.640223 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.641021 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv2n\" (UniqueName: \"kubernetes.io/projected/d9944e3b-40c6-48c2-8ccc-c38eb2304e63-kube-api-access-ndv2n\") pod \"console-operator-58897d9998-8jvnn\" (UID: \"d9944e3b-40c6-48c2-8ccc-c38eb2304e63\") " pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.644189 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stczf\" (UniqueName: \"kubernetes.io/projected/6128e348-e15b-4e9b-adc7-851ae384ec4d-kube-api-access-stczf\") pod \"dns-default-4s69k\" (UID: \"6128e348-e15b-4e9b-adc7-851ae384ec4d\") " pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.654206 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.15416457 +0000 UTC m=+154.496785767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.661609 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmgkq\" (UniqueName: \"kubernetes.io/projected/4cecc497-1936-439e-ab82-eeaa6bf9dc0b-kube-api-access-nmgkq\") pod \"ingress-canary-x4855\" (UID: \"4cecc497-1936-439e-ab82-eeaa6bf9dc0b\") " pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.670500 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.687037 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tm7js" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.687803 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lhnd\" (UniqueName: \"kubernetes.io/projected/65188bec-6189-4789-9b29-f8241a81302e-kube-api-access-7lhnd\") pod \"csi-hostpathplugin-gmb69\" (UID: \"65188bec-6189-4789-9b29-f8241a81302e\") " pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.712407 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" Nov 22 04:10:02 crc kubenswrapper[4699]: W1122 04:10:02.727757 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e2f056_f89f_40db_a6e1_336632aa9afe.slice/crio-8f0eca1e3d07a2f37653d8dea06210a0924891b9e7904517d720debbb3cd903e WatchSource:0}: Error finding container 8f0eca1e3d07a2f37653d8dea06210a0924891b9e7904517d720debbb3cd903e: Status 404 returned error can't find the container with id 8f0eca1e3d07a2f37653d8dea06210a0924891b9e7904517d720debbb3cd903e Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.728284 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.733782 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-x4855" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.742449 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.742904 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.242888132 +0000 UTC m=+154.585509319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.807526 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.817719 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.847722 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.848067 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.348052748 +0000 UTC m=+154.690673935 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.848173 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.874015 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.874070 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5nbqq"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.951522 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.951904 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.451877331 +0000 UTC m=+154.794498518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.952014 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:02 crc kubenswrapper[4699]: E1122 04:10:02.952380 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.452374023 +0000 UTC m=+154.794995200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.961848 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.987572 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dpsxh"] Nov 22 04:10:02 crc kubenswrapper[4699]: I1122 04:10:02.987794 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jlk5m"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.053562 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.053768 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.553741246 +0000 UTC m=+154.896362433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.054802 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.055275 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.555256283 +0000 UTC m=+154.897877470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.155884 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.156086 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.656050522 +0000 UTC m=+154.998671709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.156176 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.156755 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.656740949 +0000 UTC m=+154.999362136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.240781 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.257619 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.257978 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.757958818 +0000 UTC m=+155.100580005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.271123 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.289564 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.360767 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.361184 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.861168926 +0000 UTC m=+155.203790113 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.385465 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pbzdl" podStartSLOduration=132.385445223 podStartE2EDuration="2m12.385445223s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:03.341918943 +0000 UTC m=+154.684540140" watchObservedRunningTime="2025-11-22 04:10:03.385445223 +0000 UTC m=+154.728066410" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.386706 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" podStartSLOduration=132.386700404 podStartE2EDuration="2m12.386700404s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:03.384633783 +0000 UTC m=+154.727254980" watchObservedRunningTime="2025-11-22 04:10:03.386700404 +0000 UTC m=+154.729321591" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.461579 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.462215 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:03.96219713 +0000 UTC m=+155.304818317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.563248 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.563907 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.063892761 +0000 UTC m=+155.406513948 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.619613 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" event={"ID":"3fca2cfb-d582-4dbb-ab4c-199316fce981","Type":"ContainerStarted","Data":"51574152e38ba657d6b627db4abadc80e9a9c01d5555c833d66f75dee930cfbc"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.620082 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.656415 4699 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4dvpp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.656772 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.656990 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2gxkc" 
event={"ID":"854975d3-e251-4354-a2c8-84ea7da296a8","Type":"ContainerStarted","Data":"574b615ddf76549a88898ce1e15fb5f6fadad3b7587b72a33cf3c06445effbc6"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.660545 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2gxkc" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.662137 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.662175 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.665477 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.666049 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.166031223 +0000 UTC m=+155.508652410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.691122 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9sbmb" event={"ID":"c108dbbb-24af-45d9-a01f-cadab889f225","Type":"ContainerStarted","Data":"ae10abcbeb546a531e55346e8f09ec2bc5b329a992f1a29d0dfdf6f5cc5ed9d2"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.691188 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9sbmb" event={"ID":"c108dbbb-24af-45d9-a01f-cadab889f225","Type":"ContainerStarted","Data":"cae099890687377c90c6b3e1e0be3842cbcb38f09e86dc592576b566c8c06eba"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.693369 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.703948 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fqf9j"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.705676 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" event={"ID":"4e4eacc6-98af-4a4d-a161-a8629b46c1ef","Type":"ContainerStarted","Data":"222a59f66404917604e4fc9b41adaff74498a27d277275747e5c3577fb375427"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.715742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" event={"ID":"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e","Type":"ContainerStarted","Data":"e67805cdac225e77fa7adf1c9cff4316d02bc06801d79751cfa05968ead98454"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.719674 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.735498 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" event={"ID":"bfa00eae-58d2-4c7f-b232-16d1edff80f0","Type":"ContainerStarted","Data":"76e7789171b6e8fba458c893de06a645544a1749c9642c8e71cb2265be20815f"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.740709 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tm7js" event={"ID":"9b5be81b-fbbe-4991-b218-53b76f364917","Type":"ContainerStarted","Data":"a7adf583c7ead3e90859a67f616f06b973ca3948052b81eee2dd24a56c80cd41"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.740751 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tm7js" event={"ID":"9b5be81b-fbbe-4991-b218-53b76f364917","Type":"ContainerStarted","Data":"e16f568a5b889b265c88d363bfaa649036208beae3c9332c3bcdc56d44f9ef41"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.750926 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" event={"ID":"2901096e-6b5f-4a68-a8e4-fa91ff0575fd","Type":"ContainerStarted","Data":"2e3ebff5583a415ae71039075f6b7effb801ba04f653e95b3b178d2178e61286"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.750974 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" event={"ID":"2901096e-6b5f-4a68-a8e4-fa91ff0575fd","Type":"ContainerStarted","Data":"3072cd6ba29a7e02b0b1e330a7b0438d773a5a018c7360fd76e23b4dbca0d644"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.759661 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" event={"ID":"c93ff986-8043-4c13-b1a1-d24305361338","Type":"ContainerStarted","Data":"187b187717f5c904b412822116ceded1a5f3abead6b546b05123472d5a0ace2e"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.769983 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.773160 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.273145767 +0000 UTC m=+155.615766954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.783503 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.794794 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-c655t"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.833446 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.842261 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" event={"ID":"b4be8553-5539-4124-8106-6e65ba593dad","Type":"ContainerStarted","Data":"85aa4df7553d12a440ce0f54c5a1e4e4329c940aaaa761abcaac8b05cea6c1ed"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.849906 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" event={"ID":"07ed42bd-25e2-43de-bbd7-431ab818b761","Type":"ContainerStarted","Data":"2522b957b3acd5cba1c17eddf6d8819e5ec65b1942b836eefd32922bd5e75df0"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.851126 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj"] Nov 22 04:10:03 crc kubenswrapper[4699]: W1122 04:10:03.869123 4699 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bf6018_799b_4301_9b19_f68749c028b4.slice/crio-a3a7aa69e83a804178cfb1b4194cc8fbba055114589fef9d2d563cb079f54ad8 WatchSource:0}: Error finding container a3a7aa69e83a804178cfb1b4194cc8fbba055114589fef9d2d563cb079f54ad8: Status 404 returned error can't find the container with id a3a7aa69e83a804178cfb1b4194cc8fbba055114589fef9d2d563cb079f54ad8 Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.872324 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.873333 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.3733166 +0000 UTC m=+155.715937787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.878641 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" event={"ID":"12ba713a-d675-4920-8367-8d6c6ccff834","Type":"ContainerStarted","Data":"c3c1a0a7a529c2045024e949aa4604ce17dda6b6642677cf02e319a29d7e2502"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.878694 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" event={"ID":"12ba713a-d675-4920-8367-8d6c6ccff834","Type":"ContainerStarted","Data":"2c417941b61d128ca271e5f9e10bc43e5240eee79dd1b40103e0b9df2553f1d0"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.879623 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.891781 4699 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jtp76 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.892259 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" podUID="12ba713a-d675-4920-8367-8d6c6ccff834" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.922122 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" event={"ID":"86e2f056-f89f-40db-a6e1-336632aa9afe","Type":"ContainerStarted","Data":"fe494ee7440b93f1e25973775a9ead8ecd0588da40ed9747022bbd31de6b6c45"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.922191 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.922207 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" event={"ID":"86e2f056-f89f-40db-a6e1-336632aa9afe","Type":"ContainerStarted","Data":"8f0eca1e3d07a2f37653d8dea06210a0924891b9e7904517d720debbb3cd903e"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.927794 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7"] Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.935713 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" event={"ID":"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f","Type":"ContainerStarted","Data":"4fe38935f405b9ca40134f82c093bb714f3a5046c2a405d8faeb43d713138935"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.937500 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" event={"ID":"d9fca12a-5420-4c0f-90fc-05333ca3353f","Type":"ContainerStarted","Data":"86e8e5a97363b4f169d19a2497608da805ba818e0d089448111248dbd16348cd"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.937533 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" event={"ID":"d9fca12a-5420-4c0f-90fc-05333ca3353f","Type":"ContainerStarted","Data":"8d8c94495ed1f2913f4dba943770afd9968b4f588d49f73123a8cb3104a2d3b2"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.938703 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" event={"ID":"a22f23a9-587d-4796-a25c-6563be7a2792","Type":"ContainerStarted","Data":"427eaa6d5478605a9d7fc039913d2c88ffbc8b0b7898d039ebd1992a57938982"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.944481 4699 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b66nb container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" start-of-body= Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.944549 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" podUID="86e2f056-f89f-40db-a6e1-336632aa9afe" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.952056 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-842kh" event={"ID":"40353ee4-6a92-4e39-be6f-b8249f523e36","Type":"ContainerStarted","Data":"54e071cad361f10c54637aff4a6cab6f8abe14f5d19c72823470b53f46baa57e"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.958094 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" 
event={"ID":"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d","Type":"ContainerStarted","Data":"be0f8ade0c3fbcfe3b7c42928958eaca9aec7247a6e1cea04b48c8175aa0d8c5"} Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.968700 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.975758 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:03 crc kubenswrapper[4699]: E1122 04:10:03.976562 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.476542979 +0000 UTC m=+155.819164166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.978580 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.988944 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:03 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:03 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:03 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:03 crc kubenswrapper[4699]: I1122 04:10:03.989030 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.081537 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.083186 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.583161061 +0000 UTC m=+155.925782248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.083990 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.089805 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.589787884 +0000 UTC m=+155.932409071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.115392 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vdc7f"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.123221 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.141778 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.150537 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.157325 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" podStartSLOduration=134.157291614 podStartE2EDuration="2m14.157291614s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.149997114 +0000 UTC m=+155.492618301" watchObservedRunningTime="2025-11-22 04:10:04.157291614 +0000 UTC m=+155.499912801" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.161771 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gmb69"] Nov 22 04:10:04 
crc kubenswrapper[4699]: I1122 04:10:04.168853 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-8jvnn"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.177498 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"] Nov 22 04:10:04 crc kubenswrapper[4699]: W1122 04:10:04.186096 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf331d75b_ad6e_4adc_88be_379c031c7d22.slice/crio-6f469bbff8f1f313c6282367a1ff7592c734eb01f037afd81fccd25c9166962b WatchSource:0}: Error finding container 6f469bbff8f1f313c6282367a1ff7592c734eb01f037afd81fccd25c9166962b: Status 404 returned error can't find the container with id 6f469bbff8f1f313c6282367a1ff7592c734eb01f037afd81fccd25c9166962b Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.187225 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.187810 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.687793294 +0000 UTC m=+156.030414481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.261781 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" podStartSLOduration=133.261761483 podStartE2EDuration="2m13.261761483s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.25923534 +0000 UTC m=+155.601856527" watchObservedRunningTime="2025-11-22 04:10:04.261761483 +0000 UTC m=+155.604382670" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.289978 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.290407 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.790392777 +0000 UTC m=+156.133013964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.307621 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-842kh" podStartSLOduration=133.30760231 podStartE2EDuration="2m13.30760231s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.303423527 +0000 UTC m=+155.646044724" watchObservedRunningTime="2025-11-22 04:10:04.30760231 +0000 UTC m=+155.650223497" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.340477 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-x4855"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.346454 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4s69k"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.346666 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2gxkc" podStartSLOduration=133.34663132 podStartE2EDuration="2m13.34663132s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.346296071 +0000 UTC m=+155.688917278" watchObservedRunningTime="2025-11-22 04:10:04.34663132 +0000 UTC m=+155.689252507" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 
04:10:04.377298 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.380603 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-lw7n2"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.399325 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.399806 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:04.899786057 +0000 UTC m=+156.242407244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.402913 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" podStartSLOduration=134.402856752 podStartE2EDuration="2m14.402856752s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.398875324 +0000 UTC m=+155.741496521" watchObservedRunningTime="2025-11-22 04:10:04.402856752 +0000 UTC m=+155.745477939" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.423070 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kqmwr" podStartSLOduration=133.423046279 podStartE2EDuration="2m13.423046279s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.421338167 +0000 UTC m=+155.763959374" watchObservedRunningTime="2025-11-22 04:10:04.423046279 +0000 UTC m=+155.765667466" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.451379 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l"] Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.469275 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" podStartSLOduration=134.469251845 podStartE2EDuration="2m14.469251845s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.465603575 +0000 UTC m=+155.808224782" watchObservedRunningTime="2025-11-22 04:10:04.469251845 +0000 UTC m=+155.811873032" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.501344 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.502924 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.002900262 +0000 UTC m=+156.345521449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.514872 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" podStartSLOduration=133.514341204 podStartE2EDuration="2m13.514341204s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.50078332 +0000 UTC m=+155.843404507" watchObservedRunningTime="2025-11-22 04:10:04.514341204 +0000 UTC m=+155.856962381" Nov 22 04:10:04 crc kubenswrapper[4699]: W1122 04:10:04.546993 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7343df3b_7616_42dd_8e27_5f9a2031a8d9.slice/crio-7cbc7fed7dff32cc6bf54615049ca5a59084a2b3e71b3a9831130f706fbc5311 WatchSource:0}: Error finding container 7cbc7fed7dff32cc6bf54615049ca5a59084a2b3e71b3a9831130f706fbc5311: Status 404 returned error can't find the container with id 7cbc7fed7dff32cc6bf54615049ca5a59084a2b3e71b3a9831130f706fbc5311 Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.549482 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tm7js" podStartSLOduration=5.549468928 podStartE2EDuration="5.549468928s" podCreationTimestamp="2025-11-22 04:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.547258633 +0000 UTC m=+155.889879810" watchObservedRunningTime="2025-11-22 04:10:04.549468928 +0000 UTC m=+155.892090115" Nov 22 04:10:04 crc kubenswrapper[4699]: W1122 04:10:04.571779 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6128e348_e15b_4e9b_adc7_851ae384ec4d.slice/crio-a0f8bef8928ef1dfabf14467a35ed9e1c6e6eb01e5aad87c9a6e1ac0057b838d WatchSource:0}: Error finding container a0f8bef8928ef1dfabf14467a35ed9e1c6e6eb01e5aad87c9a6e1ac0057b838d: Status 404 returned error can't find the container with id a0f8bef8928ef1dfabf14467a35ed9e1c6e6eb01e5aad87c9a6e1ac0057b838d Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.603495 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.604362 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.104325087 +0000 UTC m=+156.446946274 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.622383 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9sbmb" podStartSLOduration=133.6223583 podStartE2EDuration="2m13.6223583s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.621899079 +0000 UTC m=+155.964520266" watchObservedRunningTime="2025-11-22 04:10:04.6223583 +0000 UTC m=+155.964979487" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.657946 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" podStartSLOduration=133.657925925 podStartE2EDuration="2m13.657925925s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:04.654809128 +0000 UTC m=+155.997430325" watchObservedRunningTime="2025-11-22 04:10:04.657925925 +0000 UTC m=+156.000547112" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.707569 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: 
\"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.708133 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.208118199 +0000 UTC m=+156.550739386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.809040 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.809454 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.309395799 +0000 UTC m=+156.652016986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.809883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.810269 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.31025457 +0000 UTC m=+156.652875757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.913987 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.413962721 +0000 UTC m=+156.756583918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.917990 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.918553 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:04 crc kubenswrapper[4699]: E1122 04:10:04.919033 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.419018035 +0000 UTC m=+156.761639222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.978671 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" event={"ID":"bfa00eae-58d2-4c7f-b232-16d1edff80f0","Type":"ContainerStarted","Data":"8cb7bc848baca628fec6529d5afdde5b0ac7e70ed0dfd4ffe378b18848a0348e"} Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.978743 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" event={"ID":"bfa00eae-58d2-4c7f-b232-16d1edff80f0","Type":"ContainerStarted","Data":"46f42846de4d52c8d82e9448d17b619780499badbc59b0dab64c158d94a75297"} Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.978998 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.985246 4699 patch_prober.go:28] interesting 
pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:04 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:04 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:04 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.985319 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.985724 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" event={"ID":"71c06d34-b31d-47d7-8323-510c5716530e","Type":"ContainerStarted","Data":"335168debb7b5be09110ead05c143c613b2ec5354292830619dccd64455d481b"} Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.986828 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.987122 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" event={"ID":"71c06d34-b31d-47d7-8323-510c5716530e","Type":"ContainerStarted","Data":"f60cc29a7a690c51a936973b36f15cb509ad4f279cc843ff4dad369baa43a915"} Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.994619 4699 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5vkz8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= 
Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.994688 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" podUID="71c06d34-b31d-47d7-8323-510c5716530e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Nov 22 04:10:04 crc kubenswrapper[4699]: I1122 04:10:04.996627 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4s69k" event={"ID":"6128e348-e15b-4e9b-adc7-851ae384ec4d","Type":"ContainerStarted","Data":"a0f8bef8928ef1dfabf14467a35ed9e1c6e6eb01e5aad87c9a6e1ac0057b838d"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.000950 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" event={"ID":"65188bec-6189-4789-9b29-f8241a81302e","Type":"ContainerStarted","Data":"2aef0879d9495173a48c808d252ca52eaf17be2dcc1b288aae31c8149d490bc8"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.010518 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" podStartSLOduration=134.010495955 podStartE2EDuration="2m14.010495955s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.007521452 +0000 UTC m=+156.350142649" watchObservedRunningTime="2025-11-22 04:10:05.010495955 +0000 UTC m=+156.353117152" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.020114 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.020579 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.520555542 +0000 UTC m=+156.863176739 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.026170 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" event={"ID":"44f5f1a9-edac-427c-b170-affcaa869772","Type":"ContainerStarted","Data":"9bd4868c34b9477349b6a958ac8c880f18e82bf85c2037e63a4802b64b808113"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.026227 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" event={"ID":"44f5f1a9-edac-427c-b170-affcaa869772","Type":"ContainerStarted","Data":"f8e0f5ebc41c803beb787d86f61823a381ff8ad1b55f48ee09d3da4e3513352d"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.036000 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" podStartSLOduration=134.0359796 podStartE2EDuration="2m14.0359796s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.035882298 +0000 UTC m=+156.378503495" watchObservedRunningTime="2025-11-22 04:10:05.0359796 +0000 UTC m=+156.378600787" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.061928 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" event={"ID":"07ed42bd-25e2-43de-bbd7-431ab818b761","Type":"ContainerStarted","Data":"2c19f698da261b87b80d850225d837c422dd0640b44cc92b3b2007038f2c7d2d"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.069543 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dvbjp" event={"ID":"a22f23a9-587d-4796-a25c-6563be7a2792","Type":"ContainerStarted","Data":"bd68a05e1843e2b65ba5500eb1d849cb25bd9c063e988e988b57a929a80bd2d4"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.075614 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" event={"ID":"28796a4d-fde0-4f6e-9a06-8f72bfba6473","Type":"ContainerStarted","Data":"aa4ba7e5c7cbac23694859423f3e1be56924428b111f64096cc385ae4a082a92"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.078564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" event={"ID":"0f099243-7cbe-4d7e-9e5e-062ed53026cb","Type":"ContainerStarted","Data":"c17de53447460b4be322ff189ebb60f0b76e84a042a1f4453f04554f2b334547"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.078642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" event={"ID":"0f099243-7cbe-4d7e-9e5e-062ed53026cb","Type":"ContainerStarted","Data":"e9e2bdaa86f76577dccbceaae7562e1026624a98a9130543942a5f9a86518e08"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 
04:10:05.123048 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.125203 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.625179994 +0000 UTC m=+156.967801181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.127014 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" event={"ID":"7343df3b-7616-42dd-8e27-5f9a2031a8d9","Type":"ContainerStarted","Data":"7cbc7fed7dff32cc6bf54615049ca5a59084a2b3e71b3a9831130f706fbc5311"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.132906 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-v8rsv" podStartSLOduration=134.132884883 podStartE2EDuration="2m14.132884883s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.124584759 +0000 UTC m=+156.467205966" watchObservedRunningTime="2025-11-22 04:10:05.132884883 +0000 UTC m=+156.475506080" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.133728 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" event={"ID":"98307b62-646c-4668-83a2-d5f741435197","Type":"ContainerStarted","Data":"d14b0594cd2bcac52e6ec8261f2db4f5aa27f78f56b8049cf144aa61fefc9661"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.133767 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" event={"ID":"98307b62-646c-4668-83a2-d5f741435197","Type":"ContainerStarted","Data":"2f44c78cded08c57e48aad1c00fbf787b6c3c0140fd2e14410f576205e11bed6"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.180821 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" event={"ID":"d9fca12a-5420-4c0f-90fc-05333ca3353f","Type":"ContainerStarted","Data":"326c2f9c3e21327aae39eff2efeb619d3d4d52138dc0774519e272788ff2b110"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.195966 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" event={"ID":"9fa4879e-eded-43e9-816f-151c5f0263cc","Type":"ContainerStarted","Data":"6d93fbcaa5a70514ad6f01879d67e111c004ae677a6d2bfd43f0dec86944d65b"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.201768 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" event={"ID":"11ed22c0-5989-4ddf-b4fb-290cad527cbc","Type":"ContainerStarted","Data":"cca8da7791bd74a4605b47c558f476d4a3b1252b7a6e6d1aa22d6dfa7bbd8069"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 
04:10:05.212517 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" event={"ID":"4e4eacc6-98af-4a4d-a161-a8629b46c1ef","Type":"ContainerStarted","Data":"53023f3c3206da46b35bc2ea92827d1741e15d332d99e126022703a1fe1283f9"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.212578 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" event={"ID":"4e4eacc6-98af-4a4d-a161-a8629b46c1ef","Type":"ContainerStarted","Data":"155c574f9805e97ab88a279abfe4f1a79ff96ceda761173761fd0aff4c67812f"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.224031 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.229421 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.729393507 +0000 UTC m=+157.072014694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.229897 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" event={"ID":"e597a0aa-8325-4b8b-8691-c6b2a55bc714","Type":"ContainerStarted","Data":"97c1385ea1c9556b5ba2a8234cc0f9ea5a821dd2fdc1a94ddbb31a93c37de567"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.229958 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" event={"ID":"e597a0aa-8325-4b8b-8691-c6b2a55bc714","Type":"ContainerStarted","Data":"eba6263e281a3cedf12507f24b67b974d45239f43b396c5c5b9ba30b32f6540d"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.262320 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-nl7ht" podStartSLOduration=135.262284425 podStartE2EDuration="2m15.262284425s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.261892236 +0000 UTC m=+156.604513433" watchObservedRunningTime="2025-11-22 04:10:05.262284425 +0000 UTC m=+156.604905612" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.292515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" 
event={"ID":"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e","Type":"ContainerStarted","Data":"95df560db6573331b9609e152ee50f25bf4d3dca2980bd8ed8d27ca92d2bf941"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.292907 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" event={"ID":"3b82b9e9-6fee-42bd-8cdd-3bacf580f98e","Type":"ContainerStarted","Data":"a0d1940ec1d91429feb61be24be7ff4004612398bac32cad47501f20e15ef311"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.294718 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" podStartSLOduration=134.294700843 podStartE2EDuration="2m14.294700843s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.293921923 +0000 UTC m=+156.636543110" watchObservedRunningTime="2025-11-22 04:10:05.294700843 +0000 UTC m=+156.637322030" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.296960 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" event={"ID":"a8098591-7b9f-4330-90f0-4181570d05b3","Type":"ContainerStarted","Data":"a81aa3bae081beb27517e7ba028e320ee60d61752075988f92c4142964a738c8"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.320878 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n4cff" podStartSLOduration=134.320857926 podStartE2EDuration="2m14.320857926s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.318049977 +0000 UTC m=+156.660671174" watchObservedRunningTime="2025-11-22 
04:10:05.320857926 +0000 UTC m=+156.663479113" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.324006 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" event={"ID":"f331d75b-ad6e-4adc-88be-379c031c7d22","Type":"ContainerStarted","Data":"6f469bbff8f1f313c6282367a1ff7592c734eb01f037afd81fccd25c9166962b"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.332013 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.336557 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.836540361 +0000 UTC m=+157.179161548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.338453 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tnb4t" podStartSLOduration=134.338413988 podStartE2EDuration="2m14.338413988s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.337446184 +0000 UTC m=+156.680067361" watchObservedRunningTime="2025-11-22 04:10:05.338413988 +0000 UTC m=+156.681035165" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.418862 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" event={"ID":"7dcf8242-11fe-41bf-babf-825d87eabd70","Type":"ContainerStarted","Data":"2e3ef38c1e7fbe1f9442e441b366e0d4b5e28da93f95e0b2701a4e2bccd53323"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.427096 4699 generic.go:334] "Generic (PLEG): container finished" podID="b4be8553-5539-4124-8106-6e65ba593dad" containerID="f93ae39c1be2756841e82cf9359d1a1d6ae6c219e5f6b3b223db4ec1cf838db9" exitCode=0 Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.427174 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" event={"ID":"b4be8553-5539-4124-8106-6e65ba593dad","Type":"ContainerDied","Data":"f93ae39c1be2756841e82cf9359d1a1d6ae6c219e5f6b3b223db4ec1cf838db9"} Nov 22 04:10:05 crc 
kubenswrapper[4699]: I1122 04:10:05.430904 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" event={"ID":"08e24326-1ea6-4ab6-bcac-c58f25a88358","Type":"ContainerStarted","Data":"5debe3f5949104610bc2df0cc4e4eb147e044b50319bc9da39feff7d3e9d1dbf"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.433953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.435455 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:05.935411023 +0000 UTC m=+157.278032210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.468808 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" podStartSLOduration=134.468786124 podStartE2EDuration="2m14.468786124s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.376084094 +0000 UTC m=+156.718705291" watchObservedRunningTime="2025-11-22 04:10:05.468786124 +0000 UTC m=+156.811407311" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.509518 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" event={"ID":"88c8894c-2a5e-4a5e-b3a2-83f266a23143","Type":"ContainerStarted","Data":"5831a8532c2406d1be56a5815ec4de6096cca8aa0a1767f7c454d2fc698f323b"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.509564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" event={"ID":"88c8894c-2a5e-4a5e-b3a2-83f266a23143","Type":"ContainerStarted","Data":"67d2f6f153c6be8f0b73a190b4acd39b2a7f0842e4f00cb5a55d829112cf964b"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.517518 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x4855" 
event={"ID":"4cecc497-1936-439e-ab82-eeaa6bf9dc0b","Type":"ContainerStarted","Data":"e5803057bd6e7273950fee1e2cd56a9901047e955485c9cad61d89b66094890c"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.538515 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.540525 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.040508827 +0000 UTC m=+157.383130184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.549893 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dpsxh" event={"ID":"0a91cd3c-7e69-4122-b256-f2fb9a6e0f0d","Type":"ContainerStarted","Data":"885dc667ce86f44f26db8eed8f585bceba314f59e84e346787e00f86a7e5e1fe"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.559149 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" 
event={"ID":"c93ff986-8043-4c13-b1a1-d24305361338","Type":"ContainerStarted","Data":"13aec7b32fdc5fe0bb2a8427054e3304bca90c77a0660c997ae8da168012ee8c"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.568711 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" event={"ID":"d9944e3b-40c6-48c2-8ccc-c38eb2304e63","Type":"ContainerStarted","Data":"191fa6c7d76c0e0a1904ac327f1b1141f29f6919515f66d61bd3ceb7481c8375"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.585610 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-x4855" podStartSLOduration=6.585577185 podStartE2EDuration="6.585577185s" podCreationTimestamp="2025-11-22 04:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.568303961 +0000 UTC m=+156.910925148" watchObservedRunningTime="2025-11-22 04:10:05.585577185 +0000 UTC m=+156.928198372" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.608271 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.615693 4699 generic.go:334] "Generic (PLEG): container finished" podID="85bf6018-799b-4301-9b19-f68749c028b4" containerID="0c149c343d9d084268466be38d905d66a20dee18f9e46dec1a41ade495c6acf4" exitCode=0 Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.615858 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c655t" event={"ID":"85bf6018-799b-4301-9b19-f68749c028b4","Type":"ContainerDied","Data":"0c149c343d9d084268466be38d905d66a20dee18f9e46dec1a41ade495c6acf4"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.615899 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-c655t" event={"ID":"85bf6018-799b-4301-9b19-f68749c028b4","Type":"ContainerStarted","Data":"a3a7aa69e83a804178cfb1b4194cc8fbba055114589fef9d2d563cb079f54ad8"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.640017 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.642678 4699 patch_prober.go:28] interesting pod/console-operator-58897d9998-8jvnn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.642764 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" podUID="d9944e3b-40c6-48c2-8ccc-c38eb2304e63" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.644399 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.144375971 +0000 UTC m=+157.486997158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.656291 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" event={"ID":"bbfeedc6-bc1b-4ea0-8e36-d97f8529c69f","Type":"ContainerStarted","Data":"9f239a4ab8fe0bd6c14c0c86d09731fe2b9030598ae1a70bc8b4723dd237ba46"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.704475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" event={"ID":"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0","Type":"ContainerStarted","Data":"b629a23b77326e2e1a84600d32cd00c22a41bcac2ad3cfdd6338f80f89ee8f03"} Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.705874 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.705954 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.721731 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b66nb" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.726586 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.730752 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jtp76" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.743014 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.762766 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.262744302 +0000 UTC m=+157.605365479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.782109 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5nbqq" podStartSLOduration=134.782082748 podStartE2EDuration="2m14.782082748s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.765754296 +0000 UTC m=+157.108375483" watchObservedRunningTime="2025-11-22 04:10:05.782082748 +0000 UTC m=+157.124703935" Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.849000 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.849313 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.349289511 +0000 UTC m=+157.691910698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.849595 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.850106 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.35007929 +0000 UTC m=+157.692700477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.953219 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.955938 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" podStartSLOduration=134.955923373 podStartE2EDuration="2m14.955923373s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:05.914648388 +0000 UTC m=+157.257269575" watchObservedRunningTime="2025-11-22 04:10:05.955923373 +0000 UTC m=+157.298544560" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.957638 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.457614304 +0000 UTC m=+157.800235491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.962282 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:05 crc kubenswrapper[4699]: E1122 04:10:05.962867 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.462853983 +0000 UTC m=+157.805475170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.986120 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:05 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:05 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:05 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:05 crc kubenswrapper[4699]: I1122 04:10:05.986182 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.046808 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-7l8mf" podStartSLOduration=135.046790967 podStartE2EDuration="2m15.046790967s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.046713125 +0000 UTC m=+157.389334312" watchObservedRunningTime="2025-11-22 04:10:06.046790967 +0000 UTC m=+157.389412154" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.064017 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.064389 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.56437152 +0000 UTC m=+157.906992697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.165324 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.165745 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.665728012 +0000 UTC m=+158.008349199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.178725 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" podStartSLOduration=135.178696891 podStartE2EDuration="2m15.178696891s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.177358928 +0000 UTC m=+157.519980125" watchObservedRunningTime="2025-11-22 04:10:06.178696891 +0000 UTC m=+157.521318078" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.270122 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.270645 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.770623272 +0000 UTC m=+158.113244459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.374156 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.374587 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.874572738 +0000 UTC m=+158.217193925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.479348 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.479877 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.979845637 +0000 UTC m=+158.322466824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.480171 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.480609 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:06.980592645 +0000 UTC m=+158.323213832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.581375 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.581700 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.08165941 +0000 UTC m=+158.424280597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.581865 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.582296 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.082278806 +0000 UTC m=+158.424899993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.691133 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.691645 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.191624084 +0000 UTC m=+158.534245271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.724534 4699 generic.go:334] "Generic (PLEG): container finished" podID="44f5f1a9-edac-427c-b170-affcaa869772" containerID="9bd4868c34b9477349b6a958ac8c880f18e82bf85c2037e63a4802b64b808113" exitCode=0 Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.724658 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" event={"ID":"44f5f1a9-edac-427c-b170-affcaa869772","Type":"ContainerDied","Data":"9bd4868c34b9477349b6a958ac8c880f18e82bf85c2037e63a4802b64b808113"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.724690 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" event={"ID":"44f5f1a9-edac-427c-b170-affcaa869772","Type":"ContainerStarted","Data":"e5d8570163f331a00e69c159f0aefb351efec91a535f77751ba0726ddab8177c"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.725554 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.740674 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" event={"ID":"7343df3b-7616-42dd-8e27-5f9a2031a8d9","Type":"ContainerStarted","Data":"7f32bb46d9aa24b3f8b8b42325ce4aaa02965f27f7d0541c96afe329a2ae142b"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 
04:10:06.754601 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" podStartSLOduration=135.754574092 podStartE2EDuration="2m15.754574092s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.750273557 +0000 UTC m=+158.092894744" watchObservedRunningTime="2025-11-22 04:10:06.754574092 +0000 UTC m=+158.097195279" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.756347 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fqf9j" event={"ID":"9fa4879e-eded-43e9-816f-151c5f0263cc","Type":"ContainerStarted","Data":"6148c8ade53122a63cba8f3ff0cd57bfc4161b9e6f9c5750bfadcbcd33711741"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.771636 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" event={"ID":"28796a4d-fde0-4f6e-9a06-8f72bfba6473","Type":"ContainerStarted","Data":"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.772614 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.774675 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" event={"ID":"7dcf8242-11fe-41bf-babf-825d87eabd70","Type":"ContainerStarted","Data":"6460d897a4a49a362ba1916d2b7dc0cf7deda40e0c6d2f484c98aea2d591f9cc"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.777769 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w5jfl" 
podStartSLOduration=135.777749882 podStartE2EDuration="2m15.777749882s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.775867646 +0000 UTC m=+158.118488833" watchObservedRunningTime="2025-11-22 04:10:06.777749882 +0000 UTC m=+158.120371069" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.785188 4699 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9pj89 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.785245 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.795056 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" event={"ID":"08e24326-1ea6-4ab6-bcac-c58f25a88358","Type":"ContainerStarted","Data":"b06f313f8001571b709c7c1d88cc3a441c0eb882a63716c9977b626fb00dd825"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.822601 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.825026 
4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.325006075 +0000 UTC m=+158.667627262 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.828413 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" event={"ID":"d9944e3b-40c6-48c2-8ccc-c38eb2304e63","Type":"ContainerStarted","Data":"d53d1728d6b1c7a86c7cdc76145c254df95fe653e3cb10e1f83c523993edfcbc"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.829571 4699 patch_prober.go:28] interesting pod/console-operator-58897d9998-8jvnn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.829676 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" podUID="d9944e3b-40c6-48c2-8ccc-c38eb2304e63" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.836499 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" event={"ID":"b4be8553-5539-4124-8106-6e65ba593dad","Type":"ContainerStarted","Data":"4f59d0afef7479718be26fb1cb0afc8ba0fa0174587d91bdae40052abbe7349f"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.840974 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" event={"ID":"88c8894c-2a5e-4a5e-b3a2-83f266a23143","Type":"ContainerStarted","Data":"e3c5002cb706d4a89a872a5cab8b3e48e9f340fabc0a1f444e6cfc946c841ce7"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.845945 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-x4855" event={"ID":"4cecc497-1936-439e-ab82-eeaa6bf9dc0b","Type":"ContainerStarted","Data":"8d90a1f4737e62f0dbab8a9a9ef19536141a875ce80d55bcc9912ee5361f0832"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.849529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c655t" event={"ID":"85bf6018-799b-4301-9b19-f68749c028b4","Type":"ContainerStarted","Data":"d4d3965ddbef2e13eb06efa68af3b3714db1abbf05ee34b6346e371e5d447686"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.872278 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" event={"ID":"e597a0aa-8325-4b8b-8691-c6b2a55bc714","Type":"ContainerStarted","Data":"b325afafeb0458486e8108c5805766e66a3432d0be3da1a40001ed8709e2ba56"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.873978 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" podStartSLOduration=135.873966429 podStartE2EDuration="2m15.873966429s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 04:10:06.873910777 +0000 UTC m=+158.216531964" watchObservedRunningTime="2025-11-22 04:10:06.873966429 +0000 UTC m=+158.216587616" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.878356 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" podStartSLOduration=135.878332216 podStartE2EDuration="2m15.878332216s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.82197672 +0000 UTC m=+158.164597927" watchObservedRunningTime="2025-11-22 04:10:06.878332216 +0000 UTC m=+158.220953403" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.889419 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" event={"ID":"07ed42bd-25e2-43de-bbd7-431ab818b761","Type":"ContainerStarted","Data":"ce342d5614e170aa71a58f5fcb8155d75675c5f39db0255e4c3e0a4526128545"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.893175 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" event={"ID":"a8098591-7b9f-4330-90f0-4181570d05b3","Type":"ContainerStarted","Data":"96653f0a01f146e190b3a512f954078f8b6f50501d0d46839566d3eaa15e5b34"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.896379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" event={"ID":"11ed22c0-5989-4ddf-b4fb-290cad527cbc","Type":"ContainerStarted","Data":"6b30407c38ec92a60c943e72cac082f116b239baeb26dc4107b090e20e5aa2c5"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.900258 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4s69k" 
event={"ID":"6128e348-e15b-4e9b-adc7-851ae384ec4d","Type":"ContainerStarted","Data":"b20ceb5b390547b74330631e05de26037811604b64685ad4d3b6bc068525f369"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.917096 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" event={"ID":"98307b62-646c-4668-83a2-d5f741435197","Type":"ContainerStarted","Data":"e4e1164404829106443d40d11b3f1ef1a95bb992b1af6905f0e2fc456ead4244"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.925501 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:06 crc kubenswrapper[4699]: E1122 04:10:06.926783 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.426761857 +0000 UTC m=+158.769383044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.945268 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-h4sqq" event={"ID":"ae0dbcde-8d79-4c2a-8c0b-6e64c29d52d0","Type":"ContainerStarted","Data":"c2cbf4128b5d140a0e40b46b33f405b8e2a233d135ad4efa83c2809584c35478"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.971441 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xwbdh" event={"ID":"f331d75b-ad6e-4adc-88be-379c031c7d22","Type":"ContainerStarted","Data":"90b6f56adea8e28f731c3db2ea6b7d1d282617cdcaa6c569e49fad0e911e85a7"} Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.971941 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-shhjp" podStartSLOduration=135.971906237 podStartE2EDuration="2m15.971906237s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.911156113 +0000 UTC m=+158.253777320" watchObservedRunningTime="2025-11-22 04:10:06.971906237 +0000 UTC m=+158.314527414" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.980110 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 
04:10:06.980520 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.997124 4699 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-8v9k5 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.35:8443/livez\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.997203 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" podUID="b4be8553-5539-4124-8106-6e65ba593dad" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.35:8443/livez\": dial tcp 10.217.0.35:8443: connect: connection refused" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.997370 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5vkz8" Nov 22 04:10:06 crc kubenswrapper[4699]: I1122 04:10:06.998315 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" podStartSLOduration=135.998292986 podStartE2EDuration="2m15.998292986s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.998263205 +0000 UTC m=+158.340884392" watchObservedRunningTime="2025-11-22 04:10:06.998292986 +0000 UTC m=+158.340914173" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:06.999728 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nfp7k" podStartSLOduration=135.999716261 podStartE2EDuration="2m15.999716261s" 
podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:06.958366754 +0000 UTC m=+158.300987951" watchObservedRunningTime="2025-11-22 04:10:06.999716261 +0000 UTC m=+158.342337438" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.006852 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:07 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:07 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:07 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.006936 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.027688 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.036908 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.536889665 +0000 UTC m=+158.879510842 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.055365 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jlk5m" podStartSLOduration=136.055348179 podStartE2EDuration="2m16.055348179s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:07.048663605 +0000 UTC m=+158.391284812" watchObservedRunningTime="2025-11-22 04:10:07.055348179 +0000 UTC m=+158.397969366" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.114320 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-fxd9l" podStartSLOduration=136.114299909 podStartE2EDuration="2m16.114299909s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:07.112061144 +0000 UTC m=+158.454682351" watchObservedRunningTime="2025-11-22 04:10:07.114299909 +0000 UTC m=+158.456921096" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.134221 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.134883 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.634861284 +0000 UTC m=+158.977482471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.238539 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.238939 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.738925213 +0000 UTC m=+159.081546400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.280246 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7t7r7" podStartSLOduration=136.280220519 podStartE2EDuration="2m16.280220519s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:07.192900282 +0000 UTC m=+158.535521489" watchObservedRunningTime="2025-11-22 04:10:07.280220519 +0000 UTC m=+158.622841706" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.340539 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.341346 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.841327962 +0000 UTC m=+159.183949139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.443127 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.443537 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:07.943523585 +0000 UTC m=+159.286144772 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.488667 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.490072 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.492861 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.516427 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.545017 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.545419 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 
22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.545531 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn87l\" (UniqueName: \"kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.545581 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.545641 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.045612285 +0000 UTC m=+159.388233472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.545694 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.546049 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.046021295 +0000 UTC m=+159.388642482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.622543 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.623711 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.630717 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.647037 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.647790 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn87l\" (UniqueName: \"kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.647849 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.647965 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.648808 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.648911 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.148889245 +0000 UTC m=+159.491510432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.649593 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.664486 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.715225 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn87l\" (UniqueName: \"kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l\") pod \"community-operators-bnvlx\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.753154 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.753200 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkkbz\" 
(UniqueName: \"kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.753257 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.753301 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.753628 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.25361632 +0000 UTC m=+159.596237507 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.812829 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.836698 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.851310 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.856061 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.856348 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.856422 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.856467 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkkbz\" (UniqueName: \"kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.856948 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.356929361 +0000 UTC m=+159.699550548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.857393 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.857634 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.868620 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.925800 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkkbz\" (UniqueName: \"kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz\") pod \"certified-operators-mwc7g\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.952526 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.957647 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.957702 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k2wh\" (UniqueName: \"kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.957735 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.957780 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:07 crc kubenswrapper[4699]: E1122 04:10:07.958141 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.458125439 +0000 UTC m=+159.800746626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.984478 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:07 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:07 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:07 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:07 crc kubenswrapper[4699]: I1122 04:10:07.984545 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.031759 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.033287 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.049357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" event={"ID":"7dcf8242-11fe-41bf-babf-825d87eabd70","Type":"ContainerStarted","Data":"513cb8f761f4ec8659529c4ae3f998db7e4ae14b36bf4bf1609885a8985aeb09"} Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.059273 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.059518 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.059565 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k2wh\" (UniqueName: \"kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.059601 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 
22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.060263 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.060368 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.560335513 +0000 UTC m=+159.902956700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.061223 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.063211 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.093520 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" 
event={"ID":"08e24326-1ea6-4ab6-bcac-c58f25a88358","Type":"ContainerStarted","Data":"aa6f82706d6a7efbc32dc45b4d4cd69c936f523e4182923a21c6b1e0ca01f212"} Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.100496 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k2wh\" (UniqueName: \"kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh\") pod \"community-operators-fd7rc\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.146910 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-c655t" event={"ID":"85bf6018-799b-4301-9b19-f68749c028b4","Type":"ContainerStarted","Data":"8756a41363c42f98dad6fa837101dbc68c7df680a3467186766ebfb1e53af9ec"} Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.161339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.161391 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.161517 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.161549 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfnr\" (UniqueName: \"kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.162800 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.662783412 +0000 UTC m=+160.005404599 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.165444 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vdc7f" podStartSLOduration=137.165415167 podStartE2EDuration="2m17.165415167s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:08.16310501 +0000 UTC m=+159.505726197" watchObservedRunningTime="2025-11-22 04:10:08.165415167 +0000 UTC m=+159.508036354" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.193905 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4s69k" event={"ID":"6128e348-e15b-4e9b-adc7-851ae384ec4d","Type":"ContainerStarted","Data":"766a55695b3c77723d6e5b15812192418c09ad55ee62f0eba27c86ffe37cbef4"} Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.194864 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.196914 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-lw7n2" podStartSLOduration=137.196896401 podStartE2EDuration="2m17.196896401s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 
04:10:08.195131527 +0000 UTC m=+159.537752714" watchObservedRunningTime="2025-11-22 04:10:08.196896401 +0000 UTC m=+159.539517588" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.212620 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" event={"ID":"65188bec-6189-4789-9b29-f8241a81302e","Type":"ContainerStarted","Data":"9c8c343ccdda86b556fdca834109f94bfe3500c5572d179979a1efd918269940"} Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.221385 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.230282 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mdzbj container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.230355 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" podUID="44f5f1a9-edac-427c-b170-affcaa869772" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.236313 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4s69k" podStartSLOduration=9.23628765 podStartE2EDuration="9.23628765s" podCreationTimestamp="2025-11-22 04:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:08.220523932 +0000 UTC m=+159.563145129" watchObservedRunningTime="2025-11-22 04:10:08.23628765 +0000 UTC m=+159.578908837" Nov 22 04:10:08 
crc kubenswrapper[4699]: I1122 04:10:08.241932 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mdzbj container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.241994 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" podUID="44f5f1a9-edac-427c-b170-affcaa869772" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.242359 4699 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-mdzbj container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.242377 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" podUID="44f5f1a9-edac-427c-b170-affcaa869772" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.265125 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 
04:10:08.265547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.265643 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfnr\" (UniqueName: \"kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.265718 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.266943 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.766925123 +0000 UTC m=+160.109546310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.276021 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.289722 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-c655t" podStartSLOduration=138.289696273 podStartE2EDuration="2m18.289696273s" podCreationTimestamp="2025-11-22 04:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:08.28429447 +0000 UTC m=+159.626915667" watchObservedRunningTime="2025-11-22 04:10:08.289696273 +0000 UTC m=+159.632317460" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.322847 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfnr\" (UniqueName: \"kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.362283 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.371347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.371723 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.87170745 +0000 UTC m=+160.214328637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.429384 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities\") pod \"certified-operators-8rqfc\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.474133 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.474566 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:08.974539358 +0000 UTC m=+160.317160545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.519277 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.577783 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.578272 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-22 04:10:09.078257599 +0000 UTC m=+160.420878786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.643084 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-8jvnn" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.660795 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.682060 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.682534 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.182511442 +0000 UTC m=+160.525132639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.726494 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.726545 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.778365 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.785241 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.785659 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.285646498 +0000 UTC m=+160.628267685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.893133 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.893695 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.393673284 +0000 UTC m=+160.736294471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.930097 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.986020 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:08 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:08 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:08 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.986525 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:08 crc kubenswrapper[4699]: I1122 04:10:08.996385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:08 crc kubenswrapper[4699]: E1122 04:10:08.996869 4699 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.496853342 +0000 UTC m=+160.839474539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.102023 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.102589 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.602572931 +0000 UTC m=+160.945194118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.204194 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.204524 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.704511208 +0000 UTC m=+161.047132395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.219864 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerStarted","Data":"e96d78e303782d2f04a118ac289b072eee0a5d3fad25753d29d600a1b9a9a223"} Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.223293 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerStarted","Data":"8a771fe6cdb8d1754c51ba127afc7488bec05ba1b1764710a470a771f6149150"} Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.223371 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerStarted","Data":"f7268dffb74f9895faf2867f70fbd436ef488c2e878eef075f7a3c019006ee6e"} Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.225758 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerStarted","Data":"f4f418868f80920a74633b3fac64f0fa6b90a241c83f2e345347fc8c79a10384"} Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.225819 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnvlx" 
event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerStarted","Data":"e4d0c945ed188dc0642df27c32c45b7edeb214154c1e4f09af51745e6611f6e3"} Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.305211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.306615 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.806587268 +0000 UTC m=+161.149208455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.407409 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.407772 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:09.907755486 +0000 UTC m=+161.250376673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.511033 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.511999 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.011978949 +0000 UTC m=+161.354600136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.613865 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.614334 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.114318796 +0000 UTC m=+161.456939983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.667980 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.670447 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.686108 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.715963 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.716586 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshn5\" (UniqueName: \"kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.716627 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.716647 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.716859 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.216820916 +0000 UTC m=+161.559442103 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.726062 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:10:09 crc kubenswrapper[4699]: W1122 04:10:09.808115 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7c99b8_d587_4b74_a9ad_6f6b3d8bab45.slice/crio-44919da89f0a7edcb99098cd62f83826f3e242bc96a11d9d314f0422597e6189 WatchSource:0}: Error finding container 44919da89f0a7edcb99098cd62f83826f3e242bc96a11d9d314f0422597e6189: Status 404 returned error can't find the container with id 44919da89f0a7edcb99098cd62f83826f3e242bc96a11d9d314f0422597e6189 Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.818342 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bshn5\" (UniqueName: \"kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.818416 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 
04:10:09.818461 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.818510 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.818897 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.318879546 +0000 UTC m=+161.661500733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.819795 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.820195 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.831683 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.852624 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshn5\" (UniqueName: \"kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5\") pod \"redhat-marketplace-fw85k\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.923742 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.924212 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.424179256 +0000 UTC m=+161.766800443 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.924368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:09 crc kubenswrapper[4699]: E1122 04:10:09.924912 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.424905213 +0000 UTC m=+161.767526400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.985736 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:09 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:09 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:09 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:09 crc kubenswrapper[4699]: I1122 04:10:09.985843 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.026340 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.026567 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 04:10:10.526533553 +0000 UTC m=+161.869154750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.029096 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.029594 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.529579787 +0000 UTC m=+161.872200974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.042235 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.043411 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.050006 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.059091 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.131515 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.131747 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.631701419 +0000 UTC m=+161.974322606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.131827 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.131915 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.132010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m94qf\" (UniqueName: \"kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.132043 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.132575 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.63255457 +0000 UTC m=+161.975175767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.234699 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.234966 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.734937027 +0000 UTC m=+162.077558214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.237276 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.237342 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.237467 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m94qf\" (UniqueName: \"kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.237506 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities\") pod \"redhat-marketplace-gj6mp\" (UID: 
\"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.238026 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.238195 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.738170537 +0000 UTC m=+162.080791864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.238274 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.252112 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" event={"ID":"65188bec-6189-4789-9b29-f8241a81302e","Type":"ContainerStarted","Data":"5e24e4831f1527d0c40085cc15798afe99fc283407968054d71fce97c0743da6"} 
Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.252175 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" event={"ID":"65188bec-6189-4789-9b29-f8241a81302e","Type":"ContainerStarted","Data":"3f98b0124683a1d4ec40c76cf5a745e27b3b6ba4efe5562cea4ee09d3b87d8b0"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.253989 4699 generic.go:334] "Generic (PLEG): container finished" podID="a8098591-7b9f-4330-90f0-4181570d05b3" containerID="96653f0a01f146e190b3a512f954078f8b6f50501d0d46839566d3eaa15e5b34" exitCode=0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.254054 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" event={"ID":"a8098591-7b9f-4330-90f0-4181570d05b3","Type":"ContainerDied","Data":"96653f0a01f146e190b3a512f954078f8b6f50501d0d46839566d3eaa15e5b34"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.264493 4699 generic.go:334] "Generic (PLEG): container finished" podID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerID="803cf932fdf1ff2ae134e5894f055df72cf42dfe824b356506546454f43f309d" exitCode=0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.264585 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerDied","Data":"803cf932fdf1ff2ae134e5894f055df72cf42dfe824b356506546454f43f309d"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.276988 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m94qf\" (UniqueName: \"kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf\") pod \"redhat-marketplace-gj6mp\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.278236 4699 provider.go:102] Refreshing cache 
for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.312943 4699 generic.go:334] "Generic (PLEG): container finished" podID="5e9094ed-d247-4427-86ce-adf048713377" containerID="8a771fe6cdb8d1754c51ba127afc7488bec05ba1b1764710a470a771f6149150" exitCode=0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.313014 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerDied","Data":"8a771fe6cdb8d1754c51ba127afc7488bec05ba1b1764710a470a771f6149150"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.342873 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.344590 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.844561433 +0000 UTC m=+162.187182630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.360208 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.360300 4699 generic.go:334] "Generic (PLEG): container finished" podID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerID="f4f418868f80920a74633b3fac64f0fa6b90a241c83f2e345347fc8c79a10384" exitCode=0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.361168 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerDied","Data":"f4f418868f80920a74633b3fac64f0fa6b90a241c83f2e345347fc8c79a10384"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.361295 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.363634 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.365612 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.374349 4699 generic.go:334] "Generic (PLEG): container finished" podID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerID="ce86b39860c2e07921e0e92d7d7e83251c510b86b55876a6e072cdc7ee2911f5" exitCode=0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.374647 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerDied","Data":"ce86b39860c2e07921e0e92d7d7e83251c510b86b55876a6e072cdc7ee2911f5"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.374705 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerStarted","Data":"44919da89f0a7edcb99098cd62f83826f3e242bc96a11d9d314f0422597e6189"} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.376882 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.410853 4699 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.445355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.445421 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.445526 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.445960 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 04:10:10.945942586 +0000 UTC m=+162.288563843 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-q4f67" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.463645 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.549051 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.549802 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.549869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: E1122 04:10:10.550337 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 04:10:11.050310713 +0000 UTC m=+162.392931900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.550367 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.595468 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.634589 4699 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T04:10:10.410906375Z","Handler":null,"Name":""} Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.651367 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:10:10 crc 
kubenswrapper[4699]: I1122 04:10:10.654455 4699 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.654631 4699 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.665596 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.667763 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.668044 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.672460 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.688895 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.695775 4699 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.695840 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.716350 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.769994 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.770052 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.770082 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwgd9\" (UniqueName: \"kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9\") pod \"redhat-operators-zbxdq\" (UID: 
\"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.807730 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-q4f67\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.872796 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.873221 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.873286 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.873314 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwgd9\" (UniqueName: \"kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9\") pod 
\"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.874127 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.874584 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.903325 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.905958 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwgd9\" (UniqueName: \"kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9\") pod \"redhat-operators-zbxdq\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.982198 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:10 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:10 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:10 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:10 crc kubenswrapper[4699]: I1122 04:10:10.982293 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.030166 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.034813 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.040401 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.044748 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.052289 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.059774 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.164799 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.183799 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.183871 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.183918 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6l8\" (UniqueName: \"kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.261055 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-mdzbj" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.286192 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.286241 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.286261 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6l8\" (UniqueName: \"kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.286772 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.287007 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " 
pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.318063 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6l8\" (UniqueName: \"kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8\") pod \"redhat-operators-5qk9l\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.424819 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.425653 4699 generic.go:334] "Generic (PLEG): container finished" podID="dc749a45-3392-4273-bcbd-8b637826a220" containerID="092a5bed6b16cbc8225b8acb95929067ae78ceb1913b4e3ec8bf379386318d96" exitCode=0 Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.425770 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerDied","Data":"092a5bed6b16cbc8225b8acb95929067ae78ceb1913b4e3ec8bf379386318d96"} Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.425811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerStarted","Data":"d1adb9f6c467103c566f9381011e766bda5b14426aec46d1d617ae20e138ee99"} Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.602987 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.603877 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"214d593e-dfbe-43d1-9108-d98f42ac104d","Type":"ContainerStarted","Data":"b76b070a5f9664f4b5a592bf1c652dc8c1de6dafa373854b1edfe2e6fd0b45f0"} Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.603921 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerStarted","Data":"2dd0c4130eeaef6a14369a2fa056bc85ed60d55cbfac37ec73dbfeca734b9846"} Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.607900 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" event={"ID":"65188bec-6189-4789-9b29-f8241a81302e","Type":"ContainerStarted","Data":"c2d6b5d084c7665ac9470602d3ffb385303934da8e7bdeeed6e755682c5f0795"} Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.641244 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gmb69" podStartSLOduration=12.64122628 podStartE2EDuration="12.64122628s" podCreationTimestamp="2025-11-22 04:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:11.637238452 +0000 UTC m=+162.979859639" watchObservedRunningTime="2025-11-22 04:10:11.64122628 +0000 UTC m=+162.983847457" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.764505 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.765554 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.772667 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.772729 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.772667 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.773121 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.774322 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.774687 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.778514 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.833076 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.834401 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.880729 4699 patch_prober.go:28] interesting pod/console-f9d7485db-9sbmb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.880802 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9sbmb" podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" probeResult="failure" output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.923536 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:10:11 crc kubenswrapper[4699]: W1122 04:10:11.933722 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ef45b9d_67cb_4257_869b_b6a643b49313.slice/crio-5a45fac6a1b3836852d4359096ad5b6ca29cd431001d82c30009bfd439291517 WatchSource:0}: Error finding container 5a45fac6a1b3836852d4359096ad5b6ca29cd431001d82c30009bfd439291517: Status 404 returned error can't find the container with id 5a45fac6a1b3836852d4359096ad5b6ca29cd431001d82c30009bfd439291517 Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.945590 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.945774 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.981031 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.997614 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:11 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:11 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:11 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:11 crc kubenswrapper[4699]: I1122 04:10:11.997679 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.024144 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.031118 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-8v9k5" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.049687 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.049759 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.049823 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.102370 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.150608 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.288876 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.292538 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.297337 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.299937 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.299969 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.316697 4699 patch_prober.go:28] interesting pod/apiserver-76f77b778f-c655t container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]log ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]etcd ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/generic-apiserver-start-informers ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/max-in-flight-filter ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/image.openshift.io-apiserver-caches ok Nov 22 04:10:12 crc kubenswrapper[4699]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Nov 22 04:10:12 crc kubenswrapper[4699]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Nov 22 04:10:12 crc kubenswrapper[4699]: 
[+]poststarthook/project.openshift.io-projectcache ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-startinformers ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/openshift.io-restmapperupdater ok Nov 22 04:10:12 crc kubenswrapper[4699]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 22 04:10:12 crc kubenswrapper[4699]: livez check failed Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.316826 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-c655t" podUID="85bf6018-799b-4301-9b19-f68749c028b4" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.464701 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume\") pod \"a8098591-7b9f-4330-90f0-4181570d05b3\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.465088 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume\") pod \"a8098591-7b9f-4330-90f0-4181570d05b3\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.465185 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w2mf\" (UniqueName: \"kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf\") pod \"a8098591-7b9f-4330-90f0-4181570d05b3\" (UID: \"a8098591-7b9f-4330-90f0-4181570d05b3\") " Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.466173 4699 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8098591-7b9f-4330-90f0-4181570d05b3" (UID: "a8098591-7b9f-4330-90f0-4181570d05b3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.490374 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf" (OuterVolumeSpecName: "kube-api-access-9w2mf") pod "a8098591-7b9f-4330-90f0-4181570d05b3" (UID: "a8098591-7b9f-4330-90f0-4181570d05b3"). InnerVolumeSpecName "kube-api-access-9w2mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.491893 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8098591-7b9f-4330-90f0-4181570d05b3" (UID: "a8098591-7b9f-4330-90f0-4181570d05b3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.567608 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8098591-7b9f-4330-90f0-4181570d05b3-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.567649 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8098591-7b9f-4330-90f0-4181570d05b3-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.567663 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w2mf\" (UniqueName: \"kubernetes.io/projected/a8098591-7b9f-4330-90f0-4181570d05b3-kube-api-access-9w2mf\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.631679 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"214d593e-dfbe-43d1-9108-d98f42ac104d","Type":"ContainerStarted","Data":"c63fe7ad0f26929b53b73cd7ab9eeca0af1ffcb63813eef43ef642ba8b8445c6"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.636422 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" event={"ID":"5712615f-2791-42fe-9a50-3dafe99495a0","Type":"ContainerStarted","Data":"859a9bdf234e05f2bd5b36b9c57468d2d49d48e098c6727ac1da60961db49b88"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.644742 4699 generic.go:334] "Generic (PLEG): container finished" podID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerID="88aa39ca2220dc8553262350e12c740c1b3132dc34967741446adf943d6188f5" exitCode=0 Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.644892 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" 
event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerDied","Data":"88aa39ca2220dc8553262350e12c740c1b3132dc34967741446adf943d6188f5"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.662585 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.662554634 podStartE2EDuration="2.662554634s" podCreationTimestamp="2025-11-22 04:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:12.654576758 +0000 UTC m=+163.997197965" watchObservedRunningTime="2025-11-22 04:10:12.662554634 +0000 UTC m=+164.005175821" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.668104 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerStarted","Data":"431a13b2392f84339794eaf36fc57e7debda2526e026bfa5932f80dc23766cfa"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.685621 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" event={"ID":"a8098591-7b9f-4330-90f0-4181570d05b3","Type":"ContainerDied","Data":"a81aa3bae081beb27517e7ba028e320ee60d61752075988f92c4142964a738c8"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.685696 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81aa3bae081beb27517e7ba028e320ee60d61752075988f92c4142964a738c8" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.685813 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c" Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.689660 4699 generic.go:334] "Generic (PLEG): container finished" podID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerID="0e3d31650faf30c90051c221607a8f18afa9ca91a40adfbe8a82ad426ab564c6" exitCode=0 Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.690967 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerDied","Data":"0e3d31650faf30c90051c221607a8f18afa9ca91a40adfbe8a82ad426ab564c6"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.691012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerStarted","Data":"5a45fac6a1b3836852d4359096ad5b6ca29cd431001d82c30009bfd439291517"} Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.790380 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 22 04:10:12 crc kubenswrapper[4699]: W1122 04:10:12.849590 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc5d1c0d_e8ff_4b93_be90_48a68ea750b9.slice/crio-bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11 WatchSource:0}: Error finding container bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11: Status 404 returned error can't find the container with id bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11 Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.985843 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:12 
crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:12 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:12 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:12 crc kubenswrapper[4699]: I1122 04:10:12.985931 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.387346 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.399649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82be5d0c-6f95-43e4-aa3c-9c56de3e200c-metrics-certs\") pod \"network-metrics-daemon-pj52w\" (UID: \"82be5d0c-6f95-43e4-aa3c-9c56de3e200c\") " pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.670335 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pj52w" Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.702638 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerID="ac73fdcd70ebb9b13a51614dd0e3f9c1871574554734f8dccdc80adf269ceba9" exitCode=0 Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.702723 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerDied","Data":"ac73fdcd70ebb9b13a51614dd0e3f9c1871574554734f8dccdc80adf269ceba9"} Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.706001 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9","Type":"ContainerStarted","Data":"bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11"} Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.725328 4699 generic.go:334] "Generic (PLEG): container finished" podID="214d593e-dfbe-43d1-9108-d98f42ac104d" containerID="c63fe7ad0f26929b53b73cd7ab9eeca0af1ffcb63813eef43ef642ba8b8445c6" exitCode=0 Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.725451 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"214d593e-dfbe-43d1-9108-d98f42ac104d","Type":"ContainerDied","Data":"c63fe7ad0f26929b53b73cd7ab9eeca0af1ffcb63813eef43ef642ba8b8445c6"} Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.732704 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" event={"ID":"5712615f-2791-42fe-9a50-3dafe99495a0","Type":"ContainerStarted","Data":"1047d86cc6194d44702bb5e82baf7e0b82ed87e5aad8a657b7035f6fdc7bf068"} Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.733760 4699 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.982986 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:13 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:13 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:13 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:13 crc kubenswrapper[4699]: I1122 04:10:13.983415 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.144894 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" podStartSLOduration=143.144875716 podStartE2EDuration="2m23.144875716s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:13.768954502 +0000 UTC m=+165.111575699" watchObservedRunningTime="2025-11-22 04:10:14.144875716 +0000 UTC m=+165.487496903" Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.145919 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pj52w"] Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.744891 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9","Type":"ContainerStarted","Data":"da26756518b8fcce6032cb727628c9c80e9a7ae95fd73f41d634d894b8a896ba"} Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.755210 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj52w" event={"ID":"82be5d0c-6f95-43e4-aa3c-9c56de3e200c","Type":"ContainerStarted","Data":"c5acb1cc49b4f0a2afad5317cd136ed389ba56751cf57216089e0c8cadedf383"} Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.764359 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.764333969 podStartE2EDuration="3.764333969s" podCreationTimestamp="2025-11-22 04:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:14.764019492 +0000 UTC m=+166.106640689" watchObservedRunningTime="2025-11-22 04:10:14.764333969 +0000 UTC m=+166.106955166" Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.980778 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:14 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:14 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:14 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:14 crc kubenswrapper[4699]: I1122 04:10:14.981068 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.098688 4699 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.247803 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.423588 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access\") pod \"214d593e-dfbe-43d1-9108-d98f42ac104d\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.423705 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir\") pod \"214d593e-dfbe-43d1-9108-d98f42ac104d\" (UID: \"214d593e-dfbe-43d1-9108-d98f42ac104d\") " Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.423967 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "214d593e-dfbe-43d1-9108-d98f42ac104d" (UID: "214d593e-dfbe-43d1-9108-d98f42ac104d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.429890 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "214d593e-dfbe-43d1-9108-d98f42ac104d" (UID: "214d593e-dfbe-43d1-9108-d98f42ac104d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.526864 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/214d593e-dfbe-43d1-9108-d98f42ac104d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.526902 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/214d593e-dfbe-43d1-9108-d98f42ac104d-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.780993 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj52w" event={"ID":"82be5d0c-6f95-43e4-aa3c-9c56de3e200c","Type":"ContainerStarted","Data":"67bd30d74efad7d77f07bdf67dd86d9d535d5bf3e4eb3819ec689fb3d1cfc40d"} Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.787229 4699 generic.go:334] "Generic (PLEG): container finished" podID="bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" containerID="da26756518b8fcce6032cb727628c9c80e9a7ae95fd73f41d634d894b8a896ba" exitCode=0 Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.787378 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9","Type":"ContainerDied","Data":"da26756518b8fcce6032cb727628c9c80e9a7ae95fd73f41d634d894b8a896ba"} Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.799108 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.799798 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"214d593e-dfbe-43d1-9108-d98f42ac104d","Type":"ContainerDied","Data":"b76b070a5f9664f4b5a592bf1c652dc8c1de6dafa373854b1edfe2e6fd0b45f0"} Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.799887 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b76b070a5f9664f4b5a592bf1c652dc8c1de6dafa373854b1edfe2e6fd0b45f0" Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.983379 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:15 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:15 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:15 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:15 crc kubenswrapper[4699]: I1122 04:10:15.983498 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:16 crc kubenswrapper[4699]: I1122 04:10:16.813947 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pj52w" event={"ID":"82be5d0c-6f95-43e4-aa3c-9c56de3e200c","Type":"ContainerStarted","Data":"ecf7449f017e08704cebecdff6ae3e1bc2ee45c44f6935661dc6a94ed3694dc1"} Nov 22 04:10:16 crc kubenswrapper[4699]: I1122 04:10:16.854018 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pj52w" 
podStartSLOduration=145.853993835 podStartE2EDuration="2m25.853993835s" podCreationTimestamp="2025-11-22 04:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:10:16.847815453 +0000 UTC m=+168.190436640" watchObservedRunningTime="2025-11-22 04:10:16.853993835 +0000 UTC m=+168.196615022" Nov 22 04:10:16 crc kubenswrapper[4699]: I1122 04:10:16.984683 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:16 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:16 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:16 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:16 crc kubenswrapper[4699]: I1122 04:10:16.984760 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.118679 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.271555 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir\") pod \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.271701 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access\") pod \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\" (UID: \"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9\") " Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.271724 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" (UID: "bc5d1c0d-e8ff-4b93-be90-48a68ea750b9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.272113 4699 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.279794 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" (UID: "bc5d1c0d-e8ff-4b93-be90-48a68ea750b9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.305583 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.311825 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-c655t" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.373016 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5d1c0d-e8ff-4b93-be90-48a68ea750b9-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.732736 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4s69k" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.825850 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc5d1c0d-e8ff-4b93-be90-48a68ea750b9","Type":"ContainerDied","Data":"bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11"} Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.826042 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.826830 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd655e6dadf965e623be44b94c6fc3b2927c85ee2e97ce1c163626f1818cfb11" Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.981908 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:17 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:17 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:17 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:17 crc kubenswrapper[4699]: I1122 04:10:17.981989 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:18 crc kubenswrapper[4699]: I1122 04:10:18.985982 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:18 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:18 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:18 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:18 crc kubenswrapper[4699]: I1122 04:10:18.987586 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:19 crc kubenswrapper[4699]: 
I1122 04:10:19.980221 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:19 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:19 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:19 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:19 crc kubenswrapper[4699]: I1122 04:10:19.980298 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:20 crc kubenswrapper[4699]: I1122 04:10:20.980340 4699 patch_prober.go:28] interesting pod/router-default-5444994796-842kh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 04:10:20 crc kubenswrapper[4699]: [-]has-synced failed: reason withheld Nov 22 04:10:20 crc kubenswrapper[4699]: [+]process-running ok Nov 22 04:10:20 crc kubenswrapper[4699]: healthz check failed Nov 22 04:10:20 crc kubenswrapper[4699]: I1122 04:10:20.980422 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-842kh" podUID="40353ee4-6a92-4e39-be6f-b8249f523e36" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.760566 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 
04:10:21.761250 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.760570 4699 patch_prober.go:28] interesting pod/downloads-7954f5f757-2gxkc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.761398 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2gxkc" podUID="854975d3-e251-4354-a2c8-84ea7da296a8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.827631 4699 patch_prober.go:28] interesting pod/console-f9d7485db-9sbmb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.827707 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9sbmb" podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" probeResult="failure" output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.982409 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:21 crc kubenswrapper[4699]: I1122 04:10:21.987748 4699 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-842kh" Nov 22 04:10:28 crc kubenswrapper[4699]: I1122 04:10:28.694012 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 04:10:31 crc kubenswrapper[4699]: I1122 04:10:31.087959 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:10:31 crc kubenswrapper[4699]: I1122 04:10:31.779118 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2gxkc" Nov 22 04:10:31 crc kubenswrapper[4699]: I1122 04:10:31.838006 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:10:31 crc kubenswrapper[4699]: I1122 04:10:31.844245 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:10:38 crc kubenswrapper[4699]: I1122 04:10:38.726424 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:10:38 crc kubenswrapper[4699]: I1122 04:10:38.726898 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:10:41 crc kubenswrapper[4699]: I1122 04:10:41.946745 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dpfg2" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.221196 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.221914 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bshn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy
:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fw85k_openshift-marketplace(dc749a45-3392-4273-bcbd-8b637826a220): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.223142 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fw85k" podUID="dc749a45-3392-4273-bcbd-8b637826a220" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.518895 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.519099 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m94qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gj6mp_openshift-marketplace(fb7b6485-6a90-42c3-a713-39a57596ed35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:10:47 crc kubenswrapper[4699]: E1122 04:10:47.521584 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gj6mp" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" Nov 22 04:10:57 crc 
kubenswrapper[4699]: E1122 04:10:57.346394 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gj6mp" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" Nov 22 04:11:00 crc kubenswrapper[4699]: E1122 04:11:00.004367 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 04:11:00 crc kubenswrapper[4699]: E1122 04:11:00.005531 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mwgd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zbxdq_openshift-marketplace(7ef45b9d-67cb-4257-869b-b6a643b49313): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:11:00 crc kubenswrapper[4699]: E1122 04:11:00.008587 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zbxdq" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" Nov 22 04:11:01 crc 
kubenswrapper[4699]: E1122 04:11:01.143041 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zbxdq" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" Nov 22 04:11:01 crc kubenswrapper[4699]: E1122 04:11:01.571726 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 04:11:01 crc kubenswrapper[4699]: E1122 04:11:01.572476 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn87l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bnvlx_openshift-marketplace(3394e2db-ddcc-4a0a-94d3-9336fabf5ca5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:11:01 crc kubenswrapper[4699]: E1122 04:11:01.573718 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bnvlx" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" Nov 22 04:11:02 crc 
kubenswrapper[4699]: E1122 04:11:02.370340 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 04:11:02 crc kubenswrapper[4699]: E1122 04:11:02.370583 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5s6l8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5qk9l_openshift-marketplace(0ebb453c-017c-43ff-adae-97a1e95903f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:11:02 crc kubenswrapper[4699]: E1122 04:11:02.371750 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5qk9l" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" Nov 22 04:11:08 crc kubenswrapper[4699]: I1122 04:11:08.726652 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:11:08 crc kubenswrapper[4699]: I1122 04:11:08.727253 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:11:08 crc kubenswrapper[4699]: I1122 04:11:08.727320 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:11:08 crc kubenswrapper[4699]: I1122 04:11:08.728090 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Nov 22 04:11:08 crc kubenswrapper[4699]: I1122 04:11:08.728211 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619" gracePeriod=600 Nov 22 04:11:11 crc kubenswrapper[4699]: E1122 04:11:11.094779 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bnvlx" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" Nov 22 04:11:11 crc kubenswrapper[4699]: E1122 04:11:11.094993 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5qk9l" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" Nov 22 04:11:11 crc kubenswrapper[4699]: I1122 04:11:11.214244 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619" exitCode=0 Nov 22 04:11:11 crc kubenswrapper[4699]: I1122 04:11:11.214306 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619"} Nov 22 04:11:13 crc kubenswrapper[4699]: E1122 04:11:13.481858 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 04:11:13 crc kubenswrapper[4699]: E1122 04:11:13.482491 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xnfnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-8rqfc_openshift-marketplace(7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Nov 22 04:11:13 crc kubenswrapper[4699]: E1122 04:11:13.483658 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8rqfc" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" Nov 22 04:11:14 crc kubenswrapper[4699]: E1122 04:11:14.236167 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-8rqfc" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.186050 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.186826 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2k2wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fd7rc_openshift-marketplace(444eee36-7eda-4b9c-9609-decdc3fb841b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.188377 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fd7rc" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" Nov 22 04:11:15 crc 
kubenswrapper[4699]: E1122 04:11:15.242517 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fd7rc" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.401559 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.401762 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dkkbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mwc7g_openshift-marketplace(5e9094ed-d247-4427-86ce-adf048713377): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:11:15 crc kubenswrapper[4699]: E1122 04:11:15.402965 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mwc7g" podUID="5e9094ed-d247-4427-86ce-adf048713377" Nov 22 04:11:16 crc 
kubenswrapper[4699]: I1122 04:11:16.258002 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4"} Nov 22 04:11:16 crc kubenswrapper[4699]: E1122 04:11:16.307667 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mwc7g" podUID="5e9094ed-d247-4427-86ce-adf048713377" Nov 22 04:11:17 crc kubenswrapper[4699]: I1122 04:11:17.265312 4699 generic.go:334] "Generic (PLEG): container finished" podID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerID="f3b842e2060da427700cb52d64baf4fd7fe663f287830e2b946aeff556320954" exitCode=0 Nov 22 04:11:17 crc kubenswrapper[4699]: I1122 04:11:17.265402 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerDied","Data":"f3b842e2060da427700cb52d64baf4fd7fe663f287830e2b946aeff556320954"} Nov 22 04:11:17 crc kubenswrapper[4699]: I1122 04:11:17.271823 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerStarted","Data":"a1303db5d39f4d15e73db45dde168308020e7b49e333e7297993ea72bb1a4382"} Nov 22 04:11:17 crc kubenswrapper[4699]: I1122 04:11:17.274058 4699 generic.go:334] "Generic (PLEG): container finished" podID="dc749a45-3392-4273-bcbd-8b637826a220" containerID="978a225bcf5072be231d7bf6caa34f4cf65605faa13da9a4fa000b945b4ea0d2" exitCode=0 Nov 22 04:11:17 crc kubenswrapper[4699]: I1122 04:11:17.274096 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerDied","Data":"978a225bcf5072be231d7bf6caa34f4cf65605faa13da9a4fa000b945b4ea0d2"} Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.281237 4699 generic.go:334] "Generic (PLEG): container finished" podID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerID="a1303db5d39f4d15e73db45dde168308020e7b49e333e7297993ea72bb1a4382" exitCode=0 Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.281410 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerDied","Data":"a1303db5d39f4d15e73db45dde168308020e7b49e333e7297993ea72bb1a4382"} Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.286021 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerStarted","Data":"e775131c788b4f37bd085c73de0a1c128038b8106e3d45998d8fea5c4e1d0dce"} Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.290890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerStarted","Data":"c0f975735452fb36dfda0e3202182008e8075b579ea1ee320f9ad5696a270c5c"} Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.326390 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gj6mp" podStartSLOduration=2.209169279 podStartE2EDuration="1m8.326358977s" podCreationTimestamp="2025-11-22 04:10:10 +0000 UTC" firstStartedPulling="2025-11-22 04:10:11.574175631 +0000 UTC m=+162.916796818" lastFinishedPulling="2025-11-22 04:11:17.691365329 +0000 UTC m=+229.033986516" observedRunningTime="2025-11-22 04:11:18.323491381 +0000 UTC m=+229.666112598" 
watchObservedRunningTime="2025-11-22 04:11:18.326358977 +0000 UTC m=+229.668980194" Nov 22 04:11:18 crc kubenswrapper[4699]: I1122 04:11:18.342892 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fw85k" podStartSLOduration=2.972489452 podStartE2EDuration="1m9.342867259s" podCreationTimestamp="2025-11-22 04:10:09 +0000 UTC" firstStartedPulling="2025-11-22 04:10:11.439330165 +0000 UTC m=+162.781951342" lastFinishedPulling="2025-11-22 04:11:17.809707962 +0000 UTC m=+229.152329149" observedRunningTime="2025-11-22 04:11:18.341848843 +0000 UTC m=+229.684470030" watchObservedRunningTime="2025-11-22 04:11:18.342867259 +0000 UTC m=+229.685488446" Nov 22 04:11:19 crc kubenswrapper[4699]: I1122 04:11:19.299723 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerStarted","Data":"247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe"} Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.060586 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.061364 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.231348 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.268358 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zbxdq" podStartSLOduration=4.182366057 podStartE2EDuration="1m10.268331538s" podCreationTimestamp="2025-11-22 04:10:10 +0000 UTC" firstStartedPulling="2025-11-22 04:10:12.693816263 +0000 UTC m=+164.036437450" 
lastFinishedPulling="2025-11-22 04:11:18.779781724 +0000 UTC m=+230.122402931" observedRunningTime="2025-11-22 04:11:19.322247806 +0000 UTC m=+230.664869003" watchObservedRunningTime="2025-11-22 04:11:20.268331538 +0000 UTC m=+231.610952735" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.464967 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.465041 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:20 crc kubenswrapper[4699]: I1122 04:11:20.516670 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:21 crc kubenswrapper[4699]: I1122 04:11:21.042131 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:21 crc kubenswrapper[4699]: I1122 04:11:21.042215 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:22 crc kubenswrapper[4699]: I1122 04:11:22.091509 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zbxdq" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" probeResult="failure" output=< Nov 22 04:11:22 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Nov 22 04:11:22 crc kubenswrapper[4699]: > Nov 22 04:11:26 crc kubenswrapper[4699]: I1122 04:11:26.342783 4699 generic.go:334] "Generic (PLEG): container finished" podID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerID="cc61932655e811e4a71f2ca538cf89c1ac271981777e6f886812662c063c8736" exitCode=0 Nov 22 04:11:26 crc kubenswrapper[4699]: I1122 04:11:26.342929 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerDied","Data":"cc61932655e811e4a71f2ca538cf89c1ac271981777e6f886812662c063c8736"} Nov 22 04:11:28 crc kubenswrapper[4699]: I1122 04:11:28.360947 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerStarted","Data":"7a42cab7bc1394974dbcc29ded0ccbfa29bf3fdc814f504c54baebaeeb7941c4"} Nov 22 04:11:28 crc kubenswrapper[4699]: I1122 04:11:28.363927 4699 generic.go:334] "Generic (PLEG): container finished" podID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerID="2344d0be7657c9892e1179b6dac457feb262990de1cd708dc59a5612428062aa" exitCode=0 Nov 22 04:11:28 crc kubenswrapper[4699]: I1122 04:11:28.364037 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerDied","Data":"2344d0be7657c9892e1179b6dac457feb262990de1cd708dc59a5612428062aa"} Nov 22 04:11:28 crc kubenswrapper[4699]: I1122 04:11:28.371523 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerStarted","Data":"6efaea8749b24d24a90622f831dddd6c5d4823adedec4cc03d446080d13417b0"} Nov 22 04:11:28 crc kubenswrapper[4699]: I1122 04:11:28.386679 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bnvlx" podStartSLOduration=4.084188858 podStartE2EDuration="1m21.386646011s" podCreationTimestamp="2025-11-22 04:10:07 +0000 UTC" firstStartedPulling="2025-11-22 04:10:10.366198335 +0000 UTC m=+161.708819522" lastFinishedPulling="2025-11-22 04:11:27.668655488 +0000 UTC m=+239.011276675" observedRunningTime="2025-11-22 04:11:28.382950575 +0000 UTC m=+239.725571762" 
watchObservedRunningTime="2025-11-22 04:11:28.386646011 +0000 UTC m=+239.729267198" Nov 22 04:11:29 crc kubenswrapper[4699]: I1122 04:11:29.379115 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerID="6efaea8749b24d24a90622f831dddd6c5d4823adedec4cc03d446080d13417b0" exitCode=0 Nov 22 04:11:29 crc kubenswrapper[4699]: I1122 04:11:29.379189 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerDied","Data":"6efaea8749b24d24a90622f831dddd6c5d4823adedec4cc03d446080d13417b0"} Nov 22 04:11:30 crc kubenswrapper[4699]: I1122 04:11:30.117560 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:30 crc kubenswrapper[4699]: I1122 04:11:30.526684 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:31 crc kubenswrapper[4699]: I1122 04:11:31.089079 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:31 crc kubenswrapper[4699]: I1122 04:11:31.145486 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:32 crc kubenswrapper[4699]: I1122 04:11:32.684972 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:11:32 crc kubenswrapper[4699]: I1122 04:11:32.686588 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gj6mp" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="registry-server" containerID="cri-o://c0f975735452fb36dfda0e3202182008e8075b579ea1ee320f9ad5696a270c5c" gracePeriod=2 Nov 22 04:11:34 crc kubenswrapper[4699]: 
I1122 04:11:34.417183 4699 generic.go:334] "Generic (PLEG): container finished" podID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerID="c0f975735452fb36dfda0e3202182008e8075b579ea1ee320f9ad5696a270c5c" exitCode=0 Nov 22 04:11:34 crc kubenswrapper[4699]: I1122 04:11:34.417254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerDied","Data":"c0f975735452fb36dfda0e3202182008e8075b579ea1ee320f9ad5696a270c5c"} Nov 22 04:11:34 crc kubenswrapper[4699]: I1122 04:11:34.877343 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.011260 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m94qf\" (UniqueName: \"kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf\") pod \"fb7b6485-6a90-42c3-a713-39a57596ed35\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.011388 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities\") pod \"fb7b6485-6a90-42c3-a713-39a57596ed35\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.011492 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content\") pod \"fb7b6485-6a90-42c3-a713-39a57596ed35\" (UID: \"fb7b6485-6a90-42c3-a713-39a57596ed35\") " Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.012210 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities" (OuterVolumeSpecName: "utilities") pod "fb7b6485-6a90-42c3-a713-39a57596ed35" (UID: "fb7b6485-6a90-42c3-a713-39a57596ed35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.018906 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf" (OuterVolumeSpecName: "kube-api-access-m94qf") pod "fb7b6485-6a90-42c3-a713-39a57596ed35" (UID: "fb7b6485-6a90-42c3-a713-39a57596ed35"). InnerVolumeSpecName "kube-api-access-m94qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.034907 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb7b6485-6a90-42c3-a713-39a57596ed35" (UID: "fb7b6485-6a90-42c3-a713-39a57596ed35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.113787 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m94qf\" (UniqueName: \"kubernetes.io/projected/fb7b6485-6a90-42c3-a713-39a57596ed35-kube-api-access-m94qf\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.113838 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.113849 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb7b6485-6a90-42c3-a713-39a57596ed35-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.426393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gj6mp" event={"ID":"fb7b6485-6a90-42c3-a713-39a57596ed35","Type":"ContainerDied","Data":"2dd0c4130eeaef6a14369a2fa056bc85ed60d55cbfac37ec73dbfeca734b9846"} Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.426485 4699 scope.go:117] "RemoveContainer" containerID="c0f975735452fb36dfda0e3202182008e8075b579ea1ee320f9ad5696a270c5c" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.426658 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gj6mp" Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.462176 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:11:35 crc kubenswrapper[4699]: I1122 04:11:35.463111 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gj6mp"] Nov 22 04:11:36 crc kubenswrapper[4699]: I1122 04:11:36.590065 4699 scope.go:117] "RemoveContainer" containerID="f3b842e2060da427700cb52d64baf4fd7fe663f287830e2b946aeff556320954" Nov 22 04:11:36 crc kubenswrapper[4699]: I1122 04:11:36.625944 4699 scope.go:117] "RemoveContainer" containerID="88aa39ca2220dc8553262350e12c740c1b3132dc34967741446adf943d6188f5" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.442462 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerStarted","Data":"cacc1bd0bbbf1a0a12481f9eafa810e47aac6bda3656196d1833c6287af7b216"} Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.457614 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" path="/var/lib/kubelet/pods/fb7b6485-6a90-42c3-a713-39a57596ed35/volumes" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.459726 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerStarted","Data":"7eb8a0cdf3373d7257739802c4c1a4df7f2a9caa5dbc392d9e456e7f8dca8042"} Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.465177 4699 generic.go:334] "Generic (PLEG): container finished" podID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerID="32ca3c3e137d0590c62134e67ec1fd779109848422b0933481b5ec46e9c63113" exitCode=0 Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.465264 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerDied","Data":"32ca3c3e137d0590c62134e67ec1fd779109848422b0933481b5ec46e9c63113"} Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.474458 4699 generic.go:334] "Generic (PLEG): container finished" podID="5e9094ed-d247-4427-86ce-adf048713377" containerID="e04b14c8976e9ca7dc7eb9873191b725273be7ed161c2e3e738c6a9d8aa97cf7" exitCode=0 Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.474539 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerDied","Data":"e04b14c8976e9ca7dc7eb9873191b725273be7ed161c2e3e738c6a9d8aa97cf7"} Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.493942 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8rqfc" podStartSLOduration=3.188408708 podStartE2EDuration="1m29.493917102s" podCreationTimestamp="2025-11-22 04:10:08 +0000 UTC" firstStartedPulling="2025-11-22 04:10:10.385760076 +0000 UTC m=+161.728381263" lastFinishedPulling="2025-11-22 04:11:36.69126847 +0000 UTC m=+248.033889657" observedRunningTime="2025-11-22 04:11:37.492962147 +0000 UTC m=+248.835583354" watchObservedRunningTime="2025-11-22 04:11:37.493917102 +0000 UTC m=+248.836538279" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.538345 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qk9l" podStartSLOduration=3.6125155810000003 podStartE2EDuration="1m26.538322547s" podCreationTimestamp="2025-11-22 04:10:11 +0000 UTC" firstStartedPulling="2025-11-22 04:10:13.705646685 +0000 UTC m=+165.048267872" lastFinishedPulling="2025-11-22 04:11:36.631453651 +0000 UTC m=+247.974074838" observedRunningTime="2025-11-22 04:11:37.537113465 +0000 UTC 
m=+248.879734652" watchObservedRunningTime="2025-11-22 04:11:37.538322547 +0000 UTC m=+248.880943734" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.814002 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.814086 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:37 crc kubenswrapper[4699]: I1122 04:11:37.867339 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.490113 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerStarted","Data":"7637b3d770f0eb15b3b4e3bc8bcb69819ea8aa9c7df230305d0c93fd77eea4ab"} Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.494247 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerStarted","Data":"5798dd212fea62a966b56c5515d17e786d85cb93df20ca5385f0b795080c454f"} Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.530409 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fd7rc" podStartSLOduration=3.877187254 podStartE2EDuration="1m31.530376694s" podCreationTimestamp="2025-11-22 04:10:07 +0000 UTC" firstStartedPulling="2025-11-22 04:10:10.277868013 +0000 UTC m=+161.620489200" lastFinishedPulling="2025-11-22 04:11:37.931057453 +0000 UTC m=+249.273678640" observedRunningTime="2025-11-22 04:11:38.507103814 +0000 UTC m=+249.849725001" watchObservedRunningTime="2025-11-22 04:11:38.530376694 +0000 UTC m=+249.872997881" Nov 22 04:11:38 crc kubenswrapper[4699]: 
I1122 04:11:38.535678 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwc7g" podStartSLOduration=3.933720196 podStartE2EDuration="1m31.535669483s" podCreationTimestamp="2025-11-22 04:10:07 +0000 UTC" firstStartedPulling="2025-11-22 04:10:10.315357025 +0000 UTC m=+161.657978212" lastFinishedPulling="2025-11-22 04:11:37.917306312 +0000 UTC m=+249.259927499" observedRunningTime="2025-11-22 04:11:38.528963097 +0000 UTC m=+249.871584284" watchObservedRunningTime="2025-11-22 04:11:38.535669483 +0000 UTC m=+249.878290670" Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.552083 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.661518 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.661656 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:38 crc kubenswrapper[4699]: I1122 04:11:38.700515 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:41 crc kubenswrapper[4699]: I1122 04:11:41.425754 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:11:41 crc kubenswrapper[4699]: I1122 04:11:41.426671 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:11:42 crc kubenswrapper[4699]: I1122 04:11:42.473833 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qk9l" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="registry-server" probeResult="failure" 
output=< Nov 22 04:11:42 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Nov 22 04:11:42 crc kubenswrapper[4699]: > Nov 22 04:11:47 crc kubenswrapper[4699]: I1122 04:11:47.953169 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:47 crc kubenswrapper[4699]: I1122 04:11:47.953928 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.001176 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.222564 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.222639 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.265707 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.590475 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.595467 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:48 crc kubenswrapper[4699]: I1122 04:11:48.698604 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:49 crc kubenswrapper[4699]: I1122 04:11:49.633690 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.562643 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fd7rc" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="registry-server" containerID="cri-o://7637b3d770f0eb15b3b4e3bc8bcb69819ea8aa9c7df230305d0c93fd77eea4ab" gracePeriod=2 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.615076 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.615867 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rqfc" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="registry-server" containerID="cri-o://cacc1bd0bbbf1a0a12481f9eafa810e47aac6bda3656196d1833c6287af7b216" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.627307 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.627625 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwc7g" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="registry-server" containerID="cri-o://5798dd212fea62a966b56c5515d17e786d85cb93df20ca5385f0b795080c454f" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.634602 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.635030 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bnvlx" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="registry-server" 
containerID="cri-o://7a42cab7bc1394974dbcc29ded0ccbfa29bf3fdc814f504c54baebaeeb7941c4" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.640936 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.641240 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerName="marketplace-operator" containerID="cri-o://51574152e38ba657d6b627db4abadc80e9a9c01d5555c833d66f75dee930cfbc" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.660573 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.660880 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fw85k" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="registry-server" containerID="cri-o://e775131c788b4f37bd085c73de0a1c128038b8106e3d45998d8fea5c4e1d0dce" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.668231 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.668541 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qk9l" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="registry-server" containerID="cri-o://7eb8a0cdf3373d7257739802c4c1a4df7f2a9caa5dbc392d9e456e7f8dca8042" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.676544 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pl4v"] Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.676921 
4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="extract-content" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.676944 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="extract-content" Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.676957 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="extract-utilities" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.676965 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="extract-utilities" Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.676980 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="registry-server" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.676985 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="registry-server" Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.676994 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.676999 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.677010 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214d593e-dfbe-43d1-9108-d98f42ac104d" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677015 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="214d593e-dfbe-43d1-9108-d98f42ac104d" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: E1122 04:11:50.677024 4699 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a8098591-7b9f-4330-90f0-4181570d05b3" containerName="collect-profiles" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677031 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8098591-7b9f-4330-90f0-4181570d05b3" containerName="collect-profiles" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677154 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="214d593e-dfbe-43d1-9108-d98f42ac104d" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677164 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7b6485-6a90-42c3-a713-39a57596ed35" containerName="registry-server" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677176 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8098591-7b9f-4330-90f0-4181570d05b3" containerName="collect-profiles" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677182 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5d1c0d-e8ff-4b93-be90-48a68ea750b9" containerName="pruner" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.677749 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.678060 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.678409 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zbxdq" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" containerID="cri-o://247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" gracePeriod=30 Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.689117 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pl4v"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.852646 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjtrh\" (UniqueName: \"kubernetes.io/projected/cfa86e4d-ee3e-4839-af4e-966184a73dc9-kube-api-access-wjtrh\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.852871 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.852943 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.938951 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"] Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.954136 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.954188 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.954232 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjtrh\" (UniqueName: \"kubernetes.io/projected/cfa86e4d-ee3e-4839-af4e-966184a73dc9-kube-api-access-wjtrh\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.955485 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.966273 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cfa86e4d-ee3e-4839-af4e-966184a73dc9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.976028 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjtrh\" (UniqueName: \"kubernetes.io/projected/cfa86e4d-ee3e-4839-af4e-966184a73dc9-kube-api-access-wjtrh\") pod \"marketplace-operator-79b997595-8pl4v\" (UID: \"cfa86e4d-ee3e-4839-af4e-966184a73dc9\") " pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:50 crc kubenswrapper[4699]: I1122 04:11:50.995109 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.036928 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:11:51 crc kubenswrapper[4699]: E1122 04:11:51.042275 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe is running failed: container process not found" containerID="247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:11:51 crc kubenswrapper[4699]: E1122 04:11:51.042879 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe is running failed: container process not found" containerID="247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:11:51 crc kubenswrapper[4699]: E1122 04:11:51.043681 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe is running failed: container process not found" containerID="247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" cmd=["grpc_health_probe","-addr=:50051"] Nov 22 04:11:51 crc kubenswrapper[4699]: E1122 04:11:51.043730 4699 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zbxdq" 
podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.483273 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8pl4v"] Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.570865 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerID="7eb8a0cdf3373d7257739802c4c1a4df7f2a9caa5dbc392d9e456e7f8dca8042" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.570936 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerDied","Data":"7eb8a0cdf3373d7257739802c4c1a4df7f2a9caa5dbc392d9e456e7f8dca8042"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.572189 4699 generic.go:334] "Generic (PLEG): container finished" podID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerID="51574152e38ba657d6b627db4abadc80e9a9c01d5555c833d66f75dee930cfbc" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.572229 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" event={"ID":"3fca2cfb-d582-4dbb-ab4c-199316fce981","Type":"ContainerDied","Data":"51574152e38ba657d6b627db4abadc80e9a9c01d5555c833d66f75dee930cfbc"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.574561 4699 generic.go:334] "Generic (PLEG): container finished" podID="5e9094ed-d247-4427-86ce-adf048713377" containerID="5798dd212fea62a966b56c5515d17e786d85cb93df20ca5385f0b795080c454f" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.574653 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" 
event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerDied","Data":"5798dd212fea62a966b56c5515d17e786d85cb93df20ca5385f0b795080c454f"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.581425 4699 generic.go:334] "Generic (PLEG): container finished" podID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerID="247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.581518 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerDied","Data":"247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.582656 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" event={"ID":"cfa86e4d-ee3e-4839-af4e-966184a73dc9","Type":"ContainerStarted","Data":"9c7599e12ec40e8dcbd796aa0f1abf725aea37dc8b47e3e13fcd511154dc4692"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.584564 4699 generic.go:334] "Generic (PLEG): container finished" podID="dc749a45-3392-4273-bcbd-8b637826a220" containerID="e775131c788b4f37bd085c73de0a1c128038b8106e3d45998d8fea5c4e1d0dce" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.584617 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerDied","Data":"e775131c788b4f37bd085c73de0a1c128038b8106e3d45998d8fea5c4e1d0dce"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.586341 4699 generic.go:334] "Generic (PLEG): container finished" podID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerID="7a42cab7bc1394974dbcc29ded0ccbfa29bf3fdc814f504c54baebaeeb7941c4" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.586387 4699 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerDied","Data":"7a42cab7bc1394974dbcc29ded0ccbfa29bf3fdc814f504c54baebaeeb7941c4"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.588154 4699 generic.go:334] "Generic (PLEG): container finished" podID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerID="cacc1bd0bbbf1a0a12481f9eafa810e47aac6bda3656196d1833c6287af7b216" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.588242 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerDied","Data":"cacc1bd0bbbf1a0a12481f9eafa810e47aac6bda3656196d1833c6287af7b216"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.590018 4699 generic.go:334] "Generic (PLEG): container finished" podID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerID="7637b3d770f0eb15b3b4e3bc8bcb69819ea8aa9c7df230305d0c93fd77eea4ab" exitCode=0 Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.590052 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerDied","Data":"7637b3d770f0eb15b3b4e3bc8bcb69819ea8aa9c7df230305d0c93fd77eea4ab"} Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.629576 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.727011 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.773670 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") pod \"3fca2cfb-d582-4dbb-ab4c-199316fce981\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.773739 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk7bc\" (UniqueName: \"kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc\") pod \"3fca2cfb-d582-4dbb-ab4c-199316fce981\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.773901 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") pod \"3fca2cfb-d582-4dbb-ab4c-199316fce981\" (UID: \"3fca2cfb-d582-4dbb-ab4c-199316fce981\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.774997 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3fca2cfb-d582-4dbb-ab4c-199316fce981" (UID: "3fca2cfb-d582-4dbb-ab4c-199316fce981"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.784093 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3fca2cfb-d582-4dbb-ab4c-199316fce981" (UID: "3fca2cfb-d582-4dbb-ab4c-199316fce981"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.785129 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc" (OuterVolumeSpecName: "kube-api-access-lk7bc") pod "3fca2cfb-d582-4dbb-ab4c-199316fce981" (UID: "3fca2cfb-d582-4dbb-ab4c-199316fce981"). InnerVolumeSpecName "kube-api-access-lk7bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875296 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities\") pod \"5e9094ed-d247-4427-86ce-adf048713377\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875351 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content\") pod \"5e9094ed-d247-4427-86ce-adf048713377\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875386 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkkbz\" (UniqueName: \"kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz\") pod 
\"5e9094ed-d247-4427-86ce-adf048713377\" (UID: \"5e9094ed-d247-4427-86ce-adf048713377\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875682 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875698 4699 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3fca2cfb-d582-4dbb-ab4c-199316fce981-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.875708 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk7bc\" (UniqueName: \"kubernetes.io/projected/3fca2cfb-d582-4dbb-ab4c-199316fce981-kube-api-access-lk7bc\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.876492 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities" (OuterVolumeSpecName: "utilities") pod "5e9094ed-d247-4427-86ce-adf048713377" (UID: "5e9094ed-d247-4427-86ce-adf048713377"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.879271 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz" (OuterVolumeSpecName: "kube-api-access-dkkbz") pod "5e9094ed-d247-4427-86ce-adf048713377" (UID: "5e9094ed-d247-4427-86ce-adf048713377"). InnerVolumeSpecName "kube-api-access-dkkbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.880453 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.884878 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.891046 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.906634 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.927448 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.928935 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e9094ed-d247-4427-86ce-adf048713377" (UID: "5e9094ed-d247-4427-86ce-adf048713377"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.939868 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976756 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnfnr\" (UniqueName: \"kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr\") pod \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976809 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content\") pod \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976830 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s6l8\" (UniqueName: \"kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8\") pod \"0ebb453c-017c-43ff-adae-97a1e95903f2\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities\") pod \"444eee36-7eda-4b9c-9609-decdc3fb841b\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976917 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content\") pod \"0ebb453c-017c-43ff-adae-97a1e95903f2\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.976984 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities\") pod \"0ebb453c-017c-43ff-adae-97a1e95903f2\" (UID: \"0ebb453c-017c-43ff-adae-97a1e95903f2\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977016 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities\") pod \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\" (UID: \"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977060 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content\") pod \"444eee36-7eda-4b9c-9609-decdc3fb841b\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977084 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k2wh\" (UniqueName: \"kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh\") pod \"444eee36-7eda-4b9c-9609-decdc3fb841b\" (UID: \"444eee36-7eda-4b9c-9609-decdc3fb841b\") " Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977393 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977404 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9094ed-d247-4427-86ce-adf048713377-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.977415 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkkbz\" (UniqueName: 
\"kubernetes.io/projected/5e9094ed-d247-4427-86ce-adf048713377-kube-api-access-dkkbz\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.979506 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr" (OuterVolumeSpecName: "kube-api-access-xnfnr") pod "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" (UID: "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45"). InnerVolumeSpecName "kube-api-access-xnfnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.979674 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities" (OuterVolumeSpecName: "utilities") pod "444eee36-7eda-4b9c-9609-decdc3fb841b" (UID: "444eee36-7eda-4b9c-9609-decdc3fb841b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.979697 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities" (OuterVolumeSpecName: "utilities") pod "0ebb453c-017c-43ff-adae-97a1e95903f2" (UID: "0ebb453c-017c-43ff-adae-97a1e95903f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.981414 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities" (OuterVolumeSpecName: "utilities") pod "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" (UID: "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.987053 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8" (OuterVolumeSpecName: "kube-api-access-5s6l8") pod "0ebb453c-017c-43ff-adae-97a1e95903f2" (UID: "0ebb453c-017c-43ff-adae-97a1e95903f2"). InnerVolumeSpecName "kube-api-access-5s6l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:51 crc kubenswrapper[4699]: I1122 04:11:51.991755 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh" (OuterVolumeSpecName: "kube-api-access-2k2wh") pod "444eee36-7eda-4b9c-9609-decdc3fb841b" (UID: "444eee36-7eda-4b9c-9609-decdc3fb841b"). InnerVolumeSpecName "kube-api-access-2k2wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.039165 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" (UID: "7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078063 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities\") pod \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078124 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities\") pod \"dc749a45-3392-4273-bcbd-8b637826a220\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078165 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities\") pod \"7ef45b9d-67cb-4257-869b-b6a643b49313\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078238 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwgd9\" (UniqueName: \"kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9\") pod \"7ef45b9d-67cb-4257-869b-b6a643b49313\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078272 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content\") pod \"dc749a45-3392-4273-bcbd-8b637826a220\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078306 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content\") pod \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078326 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn87l\" (UniqueName: \"kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l\") pod \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\" (UID: \"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078367 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bshn5\" (UniqueName: \"kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5\") pod \"dc749a45-3392-4273-bcbd-8b637826a220\" (UID: \"dc749a45-3392-4273-bcbd-8b637826a220\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078418 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content\") pod \"7ef45b9d-67cb-4257-869b-b6a643b49313\" (UID: \"7ef45b9d-67cb-4257-869b-b6a643b49313\") " Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078650 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnfnr\" (UniqueName: \"kubernetes.io/projected/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-kube-api-access-xnfnr\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078665 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078675 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s6l8\" (UniqueName: 
\"kubernetes.io/projected/0ebb453c-017c-43ff-adae-97a1e95903f2-kube-api-access-5s6l8\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078688 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078700 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078709 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.078719 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k2wh\" (UniqueName: \"kubernetes.io/projected/444eee36-7eda-4b9c-9609-decdc3fb841b-kube-api-access-2k2wh\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.079105 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities" (OuterVolumeSpecName: "utilities") pod "dc749a45-3392-4273-bcbd-8b637826a220" (UID: "dc749a45-3392-4273-bcbd-8b637826a220"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.079560 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities" (OuterVolumeSpecName: "utilities") pod "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" (UID: "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.079721 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities" (OuterVolumeSpecName: "utilities") pod "7ef45b9d-67cb-4257-869b-b6a643b49313" (UID: "7ef45b9d-67cb-4257-869b-b6a643b49313"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.081023 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "444eee36-7eda-4b9c-9609-decdc3fb841b" (UID: "444eee36-7eda-4b9c-9609-decdc3fb841b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.083625 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l" (OuterVolumeSpecName: "kube-api-access-tn87l") pod "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" (UID: "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5"). InnerVolumeSpecName "kube-api-access-tn87l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.083731 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5" (OuterVolumeSpecName: "kube-api-access-bshn5") pod "dc749a45-3392-4273-bcbd-8b637826a220" (UID: "dc749a45-3392-4273-bcbd-8b637826a220"). InnerVolumeSpecName "kube-api-access-bshn5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.090920 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9" (OuterVolumeSpecName: "kube-api-access-mwgd9") pod "7ef45b9d-67cb-4257-869b-b6a643b49313" (UID: "7ef45b9d-67cb-4257-869b-b6a643b49313"). InnerVolumeSpecName "kube-api-access-mwgd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.108027 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc749a45-3392-4273-bcbd-8b637826a220" (UID: "dc749a45-3392-4273-bcbd-8b637826a220"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.154541 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" (UID: "3394e2db-ddcc-4a0a-94d3-9336fabf5ca5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.173291 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ebb453c-017c-43ff-adae-97a1e95903f2" (UID: "0ebb453c-017c-43ff-adae-97a1e95903f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180453 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bshn5\" (UniqueName: \"kubernetes.io/projected/dc749a45-3392-4273-bcbd-8b637826a220-kube-api-access-bshn5\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180501 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ebb453c-017c-43ff-adae-97a1e95903f2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180517 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180531 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180545 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180558 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/444eee36-7eda-4b9c-9609-decdc3fb841b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180570 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwgd9\" (UniqueName: \"kubernetes.io/projected/7ef45b9d-67cb-4257-869b-b6a643b49313-kube-api-access-mwgd9\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180582 
4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc749a45-3392-4273-bcbd-8b637826a220-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180596 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.180608 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn87l\" (UniqueName: \"kubernetes.io/projected/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5-kube-api-access-tn87l\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.196891 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ef45b9d-67cb-4257-869b-b6a643b49313" (UID: "7ef45b9d-67cb-4257-869b-b6a643b49313"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.282562 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ef45b9d-67cb-4257-869b-b6a643b49313-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.599080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" event={"ID":"3fca2cfb-d582-4dbb-ab4c-199316fce981","Type":"ContainerDied","Data":"dc2b778e72f750da7f59b72e0d942d00238358a3d13c57d081a7de189c6811da"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.599171 4699 scope.go:117] "RemoveContainer" containerID="51574152e38ba657d6b627db4abadc80e9a9c01d5555c833d66f75dee930cfbc" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.599127 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4dvpp" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.601311 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" event={"ID":"cfa86e4d-ee3e-4839-af4e-966184a73dc9","Type":"ContainerStarted","Data":"683ad12b9612373c578d0cf39efecf6da0e259b06cfd8e134b48b8a8bda17aec"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.601577 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.604663 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bnvlx" event={"ID":"3394e2db-ddcc-4a0a-94d3-9336fabf5ca5","Type":"ContainerDied","Data":"e4d0c945ed188dc0642df27c32c45b7edeb214154c1e4f09af51745e6611f6e3"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.607589 4699 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bnvlx" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.607958 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.610478 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7rc" event={"ID":"444eee36-7eda-4b9c-9609-decdc3fb841b","Type":"ContainerDied","Data":"e96d78e303782d2f04a118ac289b072eee0a5d3fad25753d29d600a1b9a9a223"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.610526 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7rc" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.618752 4699 scope.go:117] "RemoveContainer" containerID="7a42cab7bc1394974dbcc29ded0ccbfa29bf3fdc814f504c54baebaeeb7941c4" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.627811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zbxdq" event={"ID":"7ef45b9d-67cb-4257-869b-b6a643b49313","Type":"ContainerDied","Data":"5a45fac6a1b3836852d4359096ad5b6ca29cd431001d82c30009bfd439291517"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.627934 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zbxdq" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.629533 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8pl4v" podStartSLOduration=2.629518665 podStartE2EDuration="2.629518665s" podCreationTimestamp="2025-11-22 04:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:11:52.62062602 +0000 UTC m=+263.963247227" watchObservedRunningTime="2025-11-22 04:11:52.629518665 +0000 UTC m=+263.972139862" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.634786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fw85k" event={"ID":"dc749a45-3392-4273-bcbd-8b637826a220","Type":"ContainerDied","Data":"d1adb9f6c467103c566f9381011e766bda5b14426aec46d1d617ae20e138ee99"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.634912 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fw85k" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.639025 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rqfc" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.639030 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rqfc" event={"ID":"7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45","Type":"ContainerDied","Data":"44919da89f0a7edcb99098cd62f83826f3e242bc96a11d9d314f0422597e6189"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.645491 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qk9l" event={"ID":"0ebb453c-017c-43ff-adae-97a1e95903f2","Type":"ContainerDied","Data":"431a13b2392f84339794eaf36fc57e7debda2526e026bfa5932f80dc23766cfa"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.645887 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qk9l" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.645992 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.648899 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4dvpp"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.650224 4699 scope.go:117] "RemoveContainer" containerID="cc61932655e811e4a71f2ca538cf89c1ac271981777e6f886812662c063c8736" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.651896 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwc7g" event={"ID":"5e9094ed-d247-4427-86ce-adf048713377","Type":"ContainerDied","Data":"f7268dffb74f9895faf2867f70fbd436ef488c2e878eef075f7a3c019006ee6e"} Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.652066 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwc7g" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.701297 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.716095 4699 scope.go:117] "RemoveContainer" containerID="f4f418868f80920a74633b3fac64f0fa6b90a241c83f2e345347fc8c79a10384" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.727704 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fd7rc"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.746823 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.750004 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zbxdq"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.751023 4699 scope.go:117] "RemoveContainer" containerID="7637b3d770f0eb15b3b4e3bc8bcb69819ea8aa9c7df230305d0c93fd77eea4ab" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.757367 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.758691 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fw85k"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.761830 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.765014 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bnvlx"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.767693 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 
04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.770065 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwc7g"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.780763 4699 scope.go:117] "RemoveContainer" containerID="32ca3c3e137d0590c62134e67ec1fd779109848422b0933481b5ec46e9c63113" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.781610 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.783899 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rqfc"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.796383 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.799252 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qk9l"] Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.806831 4699 scope.go:117] "RemoveContainer" containerID="803cf932fdf1ff2ae134e5894f055df72cf42dfe824b356506546454f43f309d" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.823972 4699 scope.go:117] "RemoveContainer" containerID="247996c1bea46080a18deb2f3f9de58e95fb7646c85ef816db28dad475c95afe" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.843913 4699 scope.go:117] "RemoveContainer" containerID="a1303db5d39f4d15e73db45dde168308020e7b49e333e7297993ea72bb1a4382" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.859952 4699 scope.go:117] "RemoveContainer" containerID="0e3d31650faf30c90051c221607a8f18afa9ca91a40adfbe8a82ad426ab564c6" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.879621 4699 scope.go:117] "RemoveContainer" containerID="e775131c788b4f37bd085c73de0a1c128038b8106e3d45998d8fea5c4e1d0dce" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.894924 4699 
scope.go:117] "RemoveContainer" containerID="978a225bcf5072be231d7bf6caa34f4cf65605faa13da9a4fa000b945b4ea0d2" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.908237 4699 scope.go:117] "RemoveContainer" containerID="092a5bed6b16cbc8225b8acb95929067ae78ceb1913b4e3ec8bf379386318d96" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.923987 4699 scope.go:117] "RemoveContainer" containerID="cacc1bd0bbbf1a0a12481f9eafa810e47aac6bda3656196d1833c6287af7b216" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.938985 4699 scope.go:117] "RemoveContainer" containerID="2344d0be7657c9892e1179b6dac457feb262990de1cd708dc59a5612428062aa" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.958457 4699 scope.go:117] "RemoveContainer" containerID="ce86b39860c2e07921e0e92d7d7e83251c510b86b55876a6e072cdc7ee2911f5" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.973777 4699 scope.go:117] "RemoveContainer" containerID="7eb8a0cdf3373d7257739802c4c1a4df7f2a9caa5dbc392d9e456e7f8dca8042" Nov 22 04:11:52 crc kubenswrapper[4699]: I1122 04:11:52.991351 4699 scope.go:117] "RemoveContainer" containerID="6efaea8749b24d24a90622f831dddd6c5d4823adedec4cc03d446080d13417b0" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.014604 4699 scope.go:117] "RemoveContainer" containerID="ac73fdcd70ebb9b13a51614dd0e3f9c1871574554734f8dccdc80adf269ceba9" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.034764 4699 scope.go:117] "RemoveContainer" containerID="5798dd212fea62a966b56c5515d17e786d85cb93df20ca5385f0b795080c454f" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.057732 4699 scope.go:117] "RemoveContainer" containerID="e04b14c8976e9ca7dc7eb9873191b725273be7ed161c2e3e738c6a9d8aa97cf7" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.081730 4699 scope.go:117] "RemoveContainer" containerID="8a771fe6cdb8d1754c51ba127afc7488bec05ba1b1764710a470a771f6149150" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.474284 4699 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" path="/var/lib/kubelet/pods/0ebb453c-017c-43ff-adae-97a1e95903f2/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.475005 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" path="/var/lib/kubelet/pods/3394e2db-ddcc-4a0a-94d3-9336fabf5ca5/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.475638 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" path="/var/lib/kubelet/pods/3fca2cfb-d582-4dbb-ab4c-199316fce981/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.476641 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" path="/var/lib/kubelet/pods/444eee36-7eda-4b9c-9609-decdc3fb841b/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.477327 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9094ed-d247-4427-86ce-adf048713377" path="/var/lib/kubelet/pods/5e9094ed-d247-4427-86ce-adf048713377/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.478629 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" path="/var/lib/kubelet/pods/7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.479402 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" path="/var/lib/kubelet/pods/7ef45b9d-67cb-4257-869b-b6a643b49313/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.480025 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc749a45-3392-4273-bcbd-8b637826a220" path="/var/lib/kubelet/pods/dc749a45-3392-4273-bcbd-8b637826a220/volumes" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481338 4699 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-js84l"] Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481577 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481595 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481605 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481611 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481619 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481625 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481631 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481637 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481646 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481652 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481664 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481670 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481679 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481685 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481695 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481701 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481711 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481716 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481726 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481732 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481739 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerName="marketplace-operator" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481745 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerName="marketplace-operator" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481754 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481761 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481770 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481775 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481786 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481791 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481866 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481884 4699 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481893 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481902 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481910 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481918 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481927 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481934 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481944 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481951 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481959 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481966 4699 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481972 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481979 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="extract-utilities" Nov 22 04:11:53 crc kubenswrapper[4699]: E1122 04:11:53.481985 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.481990 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="extract-content" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482095 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3394e2db-ddcc-4a0a-94d3-9336fabf5ca5" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482106 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebb453c-017c-43ff-adae-97a1e95903f2" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482118 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc749a45-3392-4273-bcbd-8b637826a220" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482126 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9094ed-d247-4427-86ce-adf048713377" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482137 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef45b9d-67cb-4257-869b-b6a643b49313" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482143 4699 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7b7c99b8-d587-4b74-a9ad-6f6b3d8bab45" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482151 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fca2cfb-d582-4dbb-ab4c-199316fce981" containerName="marketplace-operator" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.482156 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="444eee36-7eda-4b9c-9609-decdc3fb841b" containerName="registry-server" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.483077 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-js84l"] Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.483185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.486772 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.609412 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6sjj\" (UniqueName: \"kubernetes.io/projected/096fc045-af3a-4dff-bfb9-aad031dc0cc0-kube-api-access-p6sjj\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.609508 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-utilities\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.609601 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-catalog-content\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.711470 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-utilities\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.711578 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-catalog-content\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.711615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6sjj\" (UniqueName: \"kubernetes.io/projected/096fc045-af3a-4dff-bfb9-aad031dc0cc0-kube-api-access-p6sjj\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.712334 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-utilities\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.712402 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096fc045-af3a-4dff-bfb9-aad031dc0cc0-catalog-content\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.738346 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6sjj\" (UniqueName: \"kubernetes.io/projected/096fc045-af3a-4dff-bfb9-aad031dc0cc0-kube-api-access-p6sjj\") pod \"redhat-marketplace-js84l\" (UID: \"096fc045-af3a-4dff-bfb9-aad031dc0cc0\") " pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:53 crc kubenswrapper[4699]: I1122 04:11:53.801712 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.277650 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-js84l"] Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.440807 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wdh9z"] Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.442361 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.444807 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.459679 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdh9z"] Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.622821 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-utilities\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.623025 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-catalog-content\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.623083 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqwk\" (UniqueName: \"kubernetes.io/projected/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-kube-api-access-bfqwk\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.677417 4699 generic.go:334] "Generic (PLEG): container finished" podID="096fc045-af3a-4dff-bfb9-aad031dc0cc0" containerID="fb734aad73ed95e456f796acb6674340915fe780ed76bc7faafc45530e9187ee" exitCode=0 Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.677516 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js84l" event={"ID":"096fc045-af3a-4dff-bfb9-aad031dc0cc0","Type":"ContainerDied","Data":"fb734aad73ed95e456f796acb6674340915fe780ed76bc7faafc45530e9187ee"} Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.677605 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js84l" event={"ID":"096fc045-af3a-4dff-bfb9-aad031dc0cc0","Type":"ContainerStarted","Data":"78c63e9dab5ab1177016e0b10ac4805f75c42d5be984887d80749194fe548c11"} Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.724202 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-catalog-content\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.724268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqwk\" (UniqueName: \"kubernetes.io/projected/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-kube-api-access-bfqwk\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.724734 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-utilities\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.724773 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-catalog-content\") 
pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.725080 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-utilities\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.749990 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqwk\" (UniqueName: \"kubernetes.io/projected/8ee0f1f9-a840-4cb2-828e-99b87f87d60e-kube-api-access-bfqwk\") pod \"redhat-operators-wdh9z\" (UID: \"8ee0f1f9-a840-4cb2-828e-99b87f87d60e\") " pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:54 crc kubenswrapper[4699]: I1122 04:11:54.821718 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.054412 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wdh9z"] Nov 22 04:11:55 crc kubenswrapper[4699]: W1122 04:11:55.068056 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee0f1f9_a840_4cb2_828e_99b87f87d60e.slice/crio-064da89362763665db53627290df3f0e535bcd62637fd5df2f55065681813bd7 WatchSource:0}: Error finding container 064da89362763665db53627290df3f0e535bcd62637fd5df2f55065681813bd7: Status 404 returned error can't find the container with id 064da89362763665db53627290df3f0e535bcd62637fd5df2f55065681813bd7 Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.685320 4699 generic.go:334] "Generic (PLEG): container finished" podID="8ee0f1f9-a840-4cb2-828e-99b87f87d60e" containerID="f436015d099737da3622f3d6ce072fcc730e9eb09f6f822a0cf81ba49832bff6" exitCode=0 Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.685548 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdh9z" event={"ID":"8ee0f1f9-a840-4cb2-828e-99b87f87d60e","Type":"ContainerDied","Data":"f436015d099737da3622f3d6ce072fcc730e9eb09f6f822a0cf81ba49832bff6"} Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.685620 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdh9z" event={"ID":"8ee0f1f9-a840-4cb2-828e-99b87f87d60e","Type":"ContainerStarted","Data":"064da89362763665db53627290df3f0e535bcd62637fd5df2f55065681813bd7"} Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.691731 4699 generic.go:334] "Generic (PLEG): container finished" podID="096fc045-af3a-4dff-bfb9-aad031dc0cc0" containerID="a0c69aeddbbd8704a7eb946c911daca4377c72416f4c8e3cc295420f6e100aa7" exitCode=0 Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.691787 
4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js84l" event={"ID":"096fc045-af3a-4dff-bfb9-aad031dc0cc0","Type":"ContainerDied","Data":"a0c69aeddbbd8704a7eb946c911daca4377c72416f4c8e3cc295420f6e100aa7"} Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.847776 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzxlt"] Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.854145 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzxlt"] Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.854346 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:55 crc kubenswrapper[4699]: I1122 04:11:55.857362 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.043880 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnqwm\" (UniqueName: \"kubernetes.io/projected/9aa53af0-a1c6-48c6-b081-e100d6a6512f-kube-api-access-bnqwm\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.043979 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-utilities\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.044158 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-catalog-content\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.146013 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-utilities\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.146202 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-catalog-content\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.146273 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnqwm\" (UniqueName: \"kubernetes.io/projected/9aa53af0-a1c6-48c6-b081-e100d6a6512f-kube-api-access-bnqwm\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.146717 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-utilities\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.147026 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9aa53af0-a1c6-48c6-b081-e100d6a6512f-catalog-content\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.179970 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnqwm\" (UniqueName: \"kubernetes.io/projected/9aa53af0-a1c6-48c6-b081-e100d6a6512f-kube-api-access-bnqwm\") pod \"certified-operators-vzxlt\" (UID: \"9aa53af0-a1c6-48c6-b081-e100d6a6512f\") " pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.184594 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.420330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzxlt"] Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.699165 4699 generic.go:334] "Generic (PLEG): container finished" podID="9aa53af0-a1c6-48c6-b081-e100d6a6512f" containerID="e75aa136c05f855f02140ce4c055da9ee3f6180968944497ec4fbd3e68539dbf" exitCode=0 Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.699303 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzxlt" event={"ID":"9aa53af0-a1c6-48c6-b081-e100d6a6512f","Type":"ContainerDied","Data":"e75aa136c05f855f02140ce4c055da9ee3f6180968944497ec4fbd3e68539dbf"} Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.699862 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzxlt" event={"ID":"9aa53af0-a1c6-48c6-b081-e100d6a6512f","Type":"ContainerStarted","Data":"19000d8605966726617a6cedd6b33ea8ddf5b4d07f53d4611b26e02497f38616"} Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.703945 4699 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-wdh9z" event={"ID":"8ee0f1f9-a840-4cb2-828e-99b87f87d60e","Type":"ContainerStarted","Data":"c2d52d1c15fa7fbb67e1482f0b1916a5f18a6cdbd78209380da24da0ace0ac9f"} Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.707157 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-js84l" event={"ID":"096fc045-af3a-4dff-bfb9-aad031dc0cc0","Type":"ContainerStarted","Data":"2fa634f379f496113533c66af315508ddf2da08811734098a24537aac0607a54"} Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.768077 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-js84l" podStartSLOduration=2.341702757 podStartE2EDuration="3.768053551s" podCreationTimestamp="2025-11-22 04:11:53 +0000 UTC" firstStartedPulling="2025-11-22 04:11:54.679562341 +0000 UTC m=+266.022183528" lastFinishedPulling="2025-11-22 04:11:56.105913105 +0000 UTC m=+267.448534322" observedRunningTime="2025-11-22 04:11:56.766038103 +0000 UTC m=+268.108659300" watchObservedRunningTime="2025-11-22 04:11:56.768053551 +0000 UTC m=+268.110674738" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.845474 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzwkb"] Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.846850 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.852977 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.861028 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzwkb"] Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.959098 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-utilities\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.959189 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbw6c\" (UniqueName: \"kubernetes.io/projected/4e5709de-6870-45ee-979d-4cf3c01d2b20-kube-api-access-nbw6c\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:56 crc kubenswrapper[4699]: I1122 04:11:56.959625 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-catalog-content\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.061598 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbw6c\" (UniqueName: \"kubernetes.io/projected/4e5709de-6870-45ee-979d-4cf3c01d2b20-kube-api-access-nbw6c\") pod \"community-operators-wzwkb\" 
(UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.061693 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-catalog-content\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.061723 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-utilities\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.062343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-utilities\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.062917 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5709de-6870-45ee-979d-4cf3c01d2b20-catalog-content\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.084168 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbw6c\" (UniqueName: \"kubernetes.io/projected/4e5709de-6870-45ee-979d-4cf3c01d2b20-kube-api-access-nbw6c\") pod \"community-operators-wzwkb\" (UID: \"4e5709de-6870-45ee-979d-4cf3c01d2b20\") " 
pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.163102 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.366378 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzwkb"] Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.716355 4699 generic.go:334] "Generic (PLEG): container finished" podID="4e5709de-6870-45ee-979d-4cf3c01d2b20" containerID="8330cc91f10a9f5671197f9d7fde8ffa4c685929aef1dde7d8dfbf8d4040a41a" exitCode=0 Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.716496 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzwkb" event={"ID":"4e5709de-6870-45ee-979d-4cf3c01d2b20","Type":"ContainerDied","Data":"8330cc91f10a9f5671197f9d7fde8ffa4c685929aef1dde7d8dfbf8d4040a41a"} Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.716957 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzwkb" event={"ID":"4e5709de-6870-45ee-979d-4cf3c01d2b20","Type":"ContainerStarted","Data":"ed4214bd707a9ad9a8ce088ea083495ac71219e3ab3b3ecb3b9508b74c3c01bd"} Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.719025 4699 generic.go:334] "Generic (PLEG): container finished" podID="8ee0f1f9-a840-4cb2-828e-99b87f87d60e" containerID="c2d52d1c15fa7fbb67e1482f0b1916a5f18a6cdbd78209380da24da0ace0ac9f" exitCode=0 Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.719120 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdh9z" event={"ID":"8ee0f1f9-a840-4cb2-828e-99b87f87d60e","Type":"ContainerDied","Data":"c2d52d1c15fa7fbb67e1482f0b1916a5f18a6cdbd78209380da24da0ace0ac9f"} Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.729219 4699 generic.go:334] "Generic (PLEG): 
container finished" podID="9aa53af0-a1c6-48c6-b081-e100d6a6512f" containerID="9e80c6e1a130bd83a5f29d40b12c3a2f524e95a52391f00bd12620acef0b4d84" exitCode=0 Nov 22 04:11:57 crc kubenswrapper[4699]: I1122 04:11:57.729282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzxlt" event={"ID":"9aa53af0-a1c6-48c6-b081-e100d6a6512f","Type":"ContainerDied","Data":"9e80c6e1a130bd83a5f29d40b12c3a2f524e95a52391f00bd12620acef0b4d84"} Nov 22 04:11:58 crc kubenswrapper[4699]: I1122 04:11:58.737639 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzxlt" event={"ID":"9aa53af0-a1c6-48c6-b081-e100d6a6512f","Type":"ContainerStarted","Data":"baaa8e538dd9b1d567e3f14ca9aa09ff71049af473a4de3898d01efa5865cd40"} Nov 22 04:11:58 crc kubenswrapper[4699]: I1122 04:11:58.744270 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzwkb" event={"ID":"4e5709de-6870-45ee-979d-4cf3c01d2b20","Type":"ContainerStarted","Data":"05b7d6d0dc955002bc6010d5a0b3d1490e3d6f492d225dccc84089917a35b029"} Nov 22 04:11:58 crc kubenswrapper[4699]: I1122 04:11:58.749233 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wdh9z" event={"ID":"8ee0f1f9-a840-4cb2-828e-99b87f87d60e","Type":"ContainerStarted","Data":"1ff4dce0d39a2794c24e3b08841a7e9537577141429e359336873cd9ce92c68c"} Nov 22 04:11:58 crc kubenswrapper[4699]: I1122 04:11:58.763610 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzxlt" podStartSLOduration=2.3638987719999998 podStartE2EDuration="3.763585861s" podCreationTimestamp="2025-11-22 04:11:55 +0000 UTC" firstStartedPulling="2025-11-22 04:11:56.701450317 +0000 UTC m=+268.044071544" lastFinishedPulling="2025-11-22 04:11:58.101137446 +0000 UTC m=+269.443758633" observedRunningTime="2025-11-22 04:11:58.761758258 +0000 UTC m=+270.104379445" 
watchObservedRunningTime="2025-11-22 04:11:58.763585861 +0000 UTC m=+270.106207048" Nov 22 04:11:58 crc kubenswrapper[4699]: I1122 04:11:58.809582 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wdh9z" podStartSLOduration=2.005125071 podStartE2EDuration="4.809557032s" podCreationTimestamp="2025-11-22 04:11:54 +0000 UTC" firstStartedPulling="2025-11-22 04:11:55.688203664 +0000 UTC m=+267.030824891" lastFinishedPulling="2025-11-22 04:11:58.492635635 +0000 UTC m=+269.835256852" observedRunningTime="2025-11-22 04:11:58.789480635 +0000 UTC m=+270.132101832" watchObservedRunningTime="2025-11-22 04:11:58.809557032 +0000 UTC m=+270.152178219" Nov 22 04:11:59 crc kubenswrapper[4699]: I1122 04:11:59.756543 4699 generic.go:334] "Generic (PLEG): container finished" podID="4e5709de-6870-45ee-979d-4cf3c01d2b20" containerID="05b7d6d0dc955002bc6010d5a0b3d1490e3d6f492d225dccc84089917a35b029" exitCode=0 Nov 22 04:11:59 crc kubenswrapper[4699]: I1122 04:11:59.756611 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzwkb" event={"ID":"4e5709de-6870-45ee-979d-4cf3c01d2b20","Type":"ContainerDied","Data":"05b7d6d0dc955002bc6010d5a0b3d1490e3d6f492d225dccc84089917a35b029"} Nov 22 04:11:59 crc kubenswrapper[4699]: I1122 04:11:59.757294 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzwkb" event={"ID":"4e5709de-6870-45ee-979d-4cf3c01d2b20","Type":"ContainerStarted","Data":"25c664b4ac206c1cd023e4b875d8ec080b212120d0acbc3f08a8021e8ff782a7"} Nov 22 04:11:59 crc kubenswrapper[4699]: I1122 04:11:59.777603 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzwkb" podStartSLOduration=2.344247351 podStartE2EDuration="3.777585866s" podCreationTimestamp="2025-11-22 04:11:56 +0000 UTC" firstStartedPulling="2025-11-22 04:11:57.718133281 +0000 UTC 
m=+269.060754468" lastFinishedPulling="2025-11-22 04:11:59.151471796 +0000 UTC m=+270.494092983" observedRunningTime="2025-11-22 04:11:59.775456375 +0000 UTC m=+271.118077572" watchObservedRunningTime="2025-11-22 04:11:59.777585866 +0000 UTC m=+271.120207053" Nov 22 04:12:03 crc kubenswrapper[4699]: I1122 04:12:03.802411 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:12:03 crc kubenswrapper[4699]: I1122 04:12:03.803156 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:12:03 crc kubenswrapper[4699]: I1122 04:12:03.846718 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:12:04 crc kubenswrapper[4699]: I1122 04:12:04.822397 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:12:04 crc kubenswrapper[4699]: I1122 04:12:04.822455 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:12:04 crc kubenswrapper[4699]: I1122 04:12:04.846199 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-js84l" Nov 22 04:12:04 crc kubenswrapper[4699]: I1122 04:12:04.890071 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:12:05 crc kubenswrapper[4699]: I1122 04:12:05.838496 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wdh9z" Nov 22 04:12:06 crc kubenswrapper[4699]: I1122 04:12:06.185187 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:12:06 crc kubenswrapper[4699]: I1122 
04:12:06.185259 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:12:06 crc kubenswrapper[4699]: I1122 04:12:06.233267 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:12:06 crc kubenswrapper[4699]: I1122 04:12:06.856867 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzxlt" Nov 22 04:12:07 crc kubenswrapper[4699]: I1122 04:12:07.163571 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:12:07 crc kubenswrapper[4699]: I1122 04:12:07.163660 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:12:07 crc kubenswrapper[4699]: I1122 04:12:07.217930 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:12:07 crc kubenswrapper[4699]: I1122 04:12:07.843328 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzwkb" Nov 22 04:12:15 crc kubenswrapper[4699]: I1122 04:12:15.977534 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerName="oauth-openshift" containerID="cri-o://1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514" gracePeriod=15 Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.401471 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.444473 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-nlbgm"] Nov 22 04:12:16 crc kubenswrapper[4699]: E1122 04:12:16.445234 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerName="oauth-openshift" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.445369 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerName="oauth-openshift" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.445671 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerName="oauth-openshift" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.446295 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.458275 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-nlbgm"] Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.558241 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.558711 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 
04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.558898 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.559334 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.559645 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.559935 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560062 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560182 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560239 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc 
kubenswrapper[4699]: I1122 04:12:16.560282 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560308 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560336 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560359 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560393 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzwtk\" (UniqueName: \"kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk\") pod \"b3173c9e-b8ff-4407-bb12-660219ce7a55\" (UID: \"b3173c9e-b8ff-4407-bb12-660219ce7a55\") " Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560778 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-dir\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560888 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560908 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.560997 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-policies\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561255 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561300 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: 
\"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561364 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561333 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561486 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29phd\" (UniqueName: \"kubernetes.io/projected/83a578b1-1a74-4e27-af0b-348ed5c56605-kube-api-access-29phd\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561736 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561792 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.561893 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.562041 4699 reconciler_common.go:293] "Volume detached for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.562100 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.562141 4699 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.562500 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.562923 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.567074 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.567124 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk" (OuterVolumeSpecName: "kube-api-access-lzwtk") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "kube-api-access-lzwtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.568811 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.569254 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.569631 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.569846 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.570701 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.571365 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.571747 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b3173c9e-b8ff-4407-bb12-660219ce7a55" (UID: "b3173c9e-b8ff-4407-bb12-660219ce7a55"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663461 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663497 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29phd\" (UniqueName: \"kubernetes.io/projected/83a578b1-1a74-4e27-af0b-348ed5c56605-kube-api-access-29phd\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663515 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663536 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663566 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663615 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-dir\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663644 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") 
" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663661 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663683 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663699 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663725 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-policies\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663750 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663772 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663787 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-dir\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663812 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.663881 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.664100 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665175 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-audit-policies\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665550 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665658 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665819 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665850 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-service-ca\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665871 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzwtk\" (UniqueName: \"kubernetes.io/projected/b3173c9e-b8ff-4407-bb12-660219ce7a55-kube-api-access-lzwtk\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665892 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665900 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-service-ca\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665915 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665954 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.665975 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc 
kubenswrapper[4699]: I1122 04:12:16.665994 4699 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b3173c9e-b8ff-4407-bb12-660219ce7a55-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.668136 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.668732 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.668969 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-session\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.669067 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-router-certs\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " 
pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.669564 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.669899 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-error\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.670913 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.673059 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/83a578b1-1a74-4e27-af0b-348ed5c56605-v4-0-config-user-template-login\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.681064 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29phd\" (UniqueName: 
\"kubernetes.io/projected/83a578b1-1a74-4e27-af0b-348ed5c56605-kube-api-access-29phd\") pod \"oauth-openshift-6499b46898-nlbgm\" (UID: \"83a578b1-1a74-4e27-af0b-348ed5c56605\") " pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.765731 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.867951 4699 generic.go:334] "Generic (PLEG): container finished" podID="b3173c9e-b8ff-4407-bb12-660219ce7a55" containerID="1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514" exitCode=0 Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.868200 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" event={"ID":"b3173c9e-b8ff-4407-bb12-660219ce7a55","Type":"ContainerDied","Data":"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514"} Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.868260 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" event={"ID":"b3173c9e-b8ff-4407-bb12-660219ce7a55","Type":"ContainerDied","Data":"d9cb1e62be0b7f3a55fdf4d3c1dea04bf1262a6e610949ba35b8e78e86838527"} Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.868288 4699 scope.go:117] "RemoveContainer" containerID="1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.868500 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qc8mt" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.910829 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"] Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.913416 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qc8mt"] Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.922381 4699 scope.go:117] "RemoveContainer" containerID="1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514" Nov 22 04:12:16 crc kubenswrapper[4699]: E1122 04:12:16.923024 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514\": container with ID starting with 1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514 not found: ID does not exist" containerID="1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.923077 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514"} err="failed to get container status \"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514\": rpc error: code = NotFound desc = could not find container \"1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514\": container with ID starting with 1835565d9fd290b7c44c539b1e119eccb5dd882b7419a7ae76fd5ff6f514a514 not found: ID does not exist" Nov 22 04:12:16 crc kubenswrapper[4699]: I1122 04:12:16.985628 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6499b46898-nlbgm"] Nov 22 04:12:16 crc kubenswrapper[4699]: W1122 04:12:16.996911 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a578b1_1a74_4e27_af0b_348ed5c56605.slice/crio-853eb161ccd0269bc034968f746af6f65b7d5517cc4a6b63d05d20377aafc3e4 WatchSource:0}: Error finding container 853eb161ccd0269bc034968f746af6f65b7d5517cc4a6b63d05d20377aafc3e4: Status 404 returned error can't find the container with id 853eb161ccd0269bc034968f746af6f65b7d5517cc4a6b63d05d20377aafc3e4 Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.456838 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3173c9e-b8ff-4407-bb12-660219ce7a55" path="/var/lib/kubelet/pods/b3173c9e-b8ff-4407-bb12-660219ce7a55/volumes" Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.879861 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" event={"ID":"83a578b1-1a74-4e27-af0b-348ed5c56605","Type":"ContainerStarted","Data":"2e1da1cfb4bad7209c8d33ba217817ea64faebc64a8dfdc654b8fec8f83af717"} Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.880601 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" event={"ID":"83a578b1-1a74-4e27-af0b-348ed5c56605","Type":"ContainerStarted","Data":"853eb161ccd0269bc034968f746af6f65b7d5517cc4a6b63d05d20377aafc3e4"} Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.880656 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.890066 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" Nov 22 04:12:17 crc kubenswrapper[4699]: I1122 04:12:17.914014 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6499b46898-nlbgm" podStartSLOduration=27.913978943 podStartE2EDuration="27.913978943s" 
podCreationTimestamp="2025-11-22 04:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:12:17.907998961 +0000 UTC m=+289.250620158" watchObservedRunningTime="2025-11-22 04:12:17.913978943 +0000 UTC m=+289.256600160" Nov 22 04:13:38 crc kubenswrapper[4699]: I1122 04:13:38.726194 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:13:38 crc kubenswrapper[4699]: I1122 04:13:38.727010 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:08 crc kubenswrapper[4699]: I1122 04:14:08.726677 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:08 crc kubenswrapper[4699]: I1122 04:14:08.727497 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:38 crc kubenswrapper[4699]: I1122 04:14:38.726120 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:14:38 crc kubenswrapper[4699]: I1122 04:14:38.727117 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:14:38 crc kubenswrapper[4699]: I1122 04:14:38.727199 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:14:38 crc kubenswrapper[4699]: I1122 04:14:38.728204 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:14:38 crc kubenswrapper[4699]: I1122 04:14:38.728305 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4" gracePeriod=600 Nov 22 04:14:39 crc kubenswrapper[4699]: I1122 04:14:39.768711 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4" exitCode=0 Nov 22 04:14:39 crc kubenswrapper[4699]: I1122 04:14:39.768821 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4"} Nov 22 04:14:39 crc kubenswrapper[4699]: I1122 04:14:39.769332 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc"} Nov 22 04:14:39 crc kubenswrapper[4699]: I1122 04:14:39.769364 4699 scope.go:117] "RemoveContainer" containerID="191befb5ec1036276709a4720f3cd8c40d63d14818bed55c5fac998489233619" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.160913 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x"] Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.162500 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.165635 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.165851 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.167060 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x"] Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.271907 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.272050 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.272097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbmkg\" (UniqueName: \"kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.373026 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.373126 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbmkg\" (UniqueName: \"kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.373205 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.375429 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.382093 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.400951 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbmkg\" (UniqueName: \"kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg\") pod \"collect-profiles-29396415-4dg4x\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.489941 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.694077 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x"] Nov 22 04:15:00 crc kubenswrapper[4699]: I1122 04:15:00.912991 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" event={"ID":"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13","Type":"ContainerStarted","Data":"654fbd71a7e7ede5de3a7f02e0c3d927b17b69ec6be4a4356a7ceeff74e230cd"} Nov 22 04:15:01 crc kubenswrapper[4699]: I1122 04:15:01.931713 4699 generic.go:334] "Generic (PLEG): container finished" podID="cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" containerID="46b792c56e2773fb05e7bec23fb7d6241c50af8b8c54cbf9629cbd989ad8c0cb" exitCode=0 Nov 22 04:15:01 crc kubenswrapper[4699]: I1122 04:15:01.932029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" 
event={"ID":"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13","Type":"ContainerDied","Data":"46b792c56e2773fb05e7bec23fb7d6241c50af8b8c54cbf9629cbd989ad8c0cb"} Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.157918 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.315303 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume\") pod \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.315862 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbmkg\" (UniqueName: \"kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg\") pod \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.315920 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume\") pod \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\" (UID: \"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13\") " Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.316262 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" (UID: "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.332283 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg" (OuterVolumeSpecName: "kube-api-access-rbmkg") pod "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" (UID: "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13"). InnerVolumeSpecName "kube-api-access-rbmkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.332276 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" (UID: "cc8212cd-9f6e-467c-9340-5e7bd8fc2f13"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.417175 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbmkg\" (UniqueName: \"kubernetes.io/projected/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-kube-api-access-rbmkg\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.417219 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.417228 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc8212cd-9f6e-467c-9340-5e7bd8fc2f13-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.946258 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" 
event={"ID":"cc8212cd-9f6e-467c-9340-5e7bd8fc2f13","Type":"ContainerDied","Data":"654fbd71a7e7ede5de3a7f02e0c3d927b17b69ec6be4a4356a7ceeff74e230cd"} Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.946299 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654fbd71a7e7ede5de3a7f02e0c3d927b17b69ec6be4a4356a7ceeff74e230cd" Nov 22 04:15:03 crc kubenswrapper[4699]: I1122 04:15:03.946385 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396415-4dg4x" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.676147 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fx6ws"] Nov 22 04:15:08 crc kubenswrapper[4699]: E1122 04:15:08.676869 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" containerName="collect-profiles" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.676883 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" containerName="collect-profiles" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.676987 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8212cd-9f6e-467c-9340-5e7bd8fc2f13" containerName="collect-profiles" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.677432 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.693585 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fx6ws"] Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791018 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791079 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-tls\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791136 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-certificates\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791201 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5nv\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-kube-api-access-pw5nv\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791236 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791303 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-trusted-ca\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.791385 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-bound-sa-token\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.820010 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892715 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-trusted-ca\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892782 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-bound-sa-token\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892825 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892859 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-tls\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892888 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-certificates\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892922 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.892940 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5nv\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-kube-api-access-pw5nv\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.893552 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.894480 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-trusted-ca\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.894677 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-certificates\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.900881 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-registry-tls\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.901811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.910435 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-bound-sa-token\") pod \"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.913999 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5nv\" (UniqueName: \"kubernetes.io/projected/1baea46b-8cfe-4a8f-86cb-410fdb8b1032-kube-api-access-pw5nv\") pod 
\"image-registry-66df7c8f76-fx6ws\" (UID: \"1baea46b-8cfe-4a8f-86cb-410fdb8b1032\") " pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:08 crc kubenswrapper[4699]: I1122 04:15:08.996906 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:09 crc kubenswrapper[4699]: I1122 04:15:09.206515 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fx6ws"] Nov 22 04:15:09 crc kubenswrapper[4699]: W1122 04:15:09.219762 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1baea46b_8cfe_4a8f_86cb_410fdb8b1032.slice/crio-e7efbc0fc22f64dfa19a8a8a6ca8beaba92698f39d7d8338f4ba5f2fb34b1e6b WatchSource:0}: Error finding container e7efbc0fc22f64dfa19a8a8a6ca8beaba92698f39d7d8338f4ba5f2fb34b1e6b: Status 404 returned error can't find the container with id e7efbc0fc22f64dfa19a8a8a6ca8beaba92698f39d7d8338f4ba5f2fb34b1e6b Nov 22 04:15:09 crc kubenswrapper[4699]: I1122 04:15:09.982476 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" event={"ID":"1baea46b-8cfe-4a8f-86cb-410fdb8b1032","Type":"ContainerStarted","Data":"c0d50206d500b15419cbaf6e17b8841a84345ab1659da38f6c775938f50e1753"} Nov 22 04:15:09 crc kubenswrapper[4699]: I1122 04:15:09.982847 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" event={"ID":"1baea46b-8cfe-4a8f-86cb-410fdb8b1032","Type":"ContainerStarted","Data":"e7efbc0fc22f64dfa19a8a8a6ca8beaba92698f39d7d8338f4ba5f2fb34b1e6b"} Nov 22 04:15:10 crc kubenswrapper[4699]: I1122 04:15:10.989492 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:11 crc kubenswrapper[4699]: I1122 04:15:11.015589 
4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" podStartSLOduration=3.015570218 podStartE2EDuration="3.015570218s" podCreationTimestamp="2025-11-22 04:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:15:11.0090355 +0000 UTC m=+462.351656697" watchObservedRunningTime="2025-11-22 04:15:11.015570218 +0000 UTC m=+462.358191405" Nov 22 04:15:29 crc kubenswrapper[4699]: I1122 04:15:29.036467 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fx6ws" Nov 22 04:15:29 crc kubenswrapper[4699]: I1122 04:15:29.100991 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:15:54 crc kubenswrapper[4699]: I1122 04:15:54.147704 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" podUID="5712615f-2791-42fe-9a50-3dafe99495a0" containerName="registry" containerID="cri-o://1047d86cc6194d44702bb5e82baf7e0b82ed87e5aad8a657b7035f6fdc7bf068" gracePeriod=30 Nov 22 04:15:55 crc kubenswrapper[4699]: I1122 04:15:55.301093 4699 generic.go:334] "Generic (PLEG): container finished" podID="5712615f-2791-42fe-9a50-3dafe99495a0" containerID="1047d86cc6194d44702bb5e82baf7e0b82ed87e5aad8a657b7035f6fdc7bf068" exitCode=0 Nov 22 04:15:55 crc kubenswrapper[4699]: I1122 04:15:55.301175 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" event={"ID":"5712615f-2791-42fe-9a50-3dafe99495a0","Type":"ContainerDied","Data":"1047d86cc6194d44702bb5e82baf7e0b82ed87e5aad8a657b7035f6fdc7bf068"} Nov 22 04:15:55 crc kubenswrapper[4699]: I1122 04:15:55.830976 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021418 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021536 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021567 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021590 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d4jd\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021931 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.021968 4699 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.022072 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.022105 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token\") pod \"5712615f-2791-42fe-9a50-3dafe99495a0\" (UID: \"5712615f-2791-42fe-9a50-3dafe99495a0\") " Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.023571 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.023655 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.031484 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.031708 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.031910 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd" (OuterVolumeSpecName: "kube-api-access-7d4jd") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "kube-api-access-7d4jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.032141 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.038667 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.043755 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5712615f-2791-42fe-9a50-3dafe99495a0" (UID: "5712615f-2791-42fe-9a50-3dafe99495a0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123685 4699 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123723 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5712615f-2791-42fe-9a50-3dafe99495a0-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123734 4699 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123743 4699 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/5712615f-2791-42fe-9a50-3dafe99495a0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123752 4699 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123760 4699 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5712615f-2791-42fe-9a50-3dafe99495a0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.123769 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d4jd\" (UniqueName: \"kubernetes.io/projected/5712615f-2791-42fe-9a50-3dafe99495a0-kube-api-access-7d4jd\") on node \"crc\" DevicePath \"\"" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.311418 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" event={"ID":"5712615f-2791-42fe-9a50-3dafe99495a0","Type":"ContainerDied","Data":"859a9bdf234e05f2bd5b36b9c57468d2d49d48e098c6727ac1da60961db49b88"} Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.311606 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-q4f67" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.312021 4699 scope.go:117] "RemoveContainer" containerID="1047d86cc6194d44702bb5e82baf7e0b82ed87e5aad8a657b7035f6fdc7bf068" Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.352078 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:15:56 crc kubenswrapper[4699]: I1122 04:15:56.356499 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-q4f67"] Nov 22 04:15:57 crc kubenswrapper[4699]: I1122 04:15:57.457330 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5712615f-2791-42fe-9a50-3dafe99495a0" path="/var/lib/kubelet/pods/5712615f-2791-42fe-9a50-3dafe99495a0/volumes" Nov 22 04:17:08 crc kubenswrapper[4699]: I1122 04:17:08.726400 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:17:08 crc kubenswrapper[4699]: I1122 04:17:08.727237 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:17:38 crc kubenswrapper[4699]: I1122 04:17:38.726680 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:17:38 
crc kubenswrapper[4699]: I1122 04:17:38.727363 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:18:08 crc kubenswrapper[4699]: I1122 04:18:08.725940 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:18:08 crc kubenswrapper[4699]: I1122 04:18:08.726901 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:18:08 crc kubenswrapper[4699]: I1122 04:18:08.726984 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:18:08 crc kubenswrapper[4699]: I1122 04:18:08.728124 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:18:08 crc kubenswrapper[4699]: I1122 04:18:08.728254 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" 
podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc" gracePeriod=600 Nov 22 04:18:09 crc kubenswrapper[4699]: I1122 04:18:09.086007 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc" exitCode=0 Nov 22 04:18:09 crc kubenswrapper[4699]: I1122 04:18:09.086096 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc"} Nov 22 04:18:09 crc kubenswrapper[4699]: I1122 04:18:09.086178 4699 scope.go:117] "RemoveContainer" containerID="b84dc855d87746ccb34a8ac352c10879b9e75beb43a499be92456accaff795b4" Nov 22 04:18:10 crc kubenswrapper[4699]: I1122 04:18:10.095705 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283"} Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.515773 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mcllz"] Nov 22 04:18:24 crc kubenswrapper[4699]: E1122 04:18:24.516703 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5712615f-2791-42fe-9a50-3dafe99495a0" containerName="registry" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.516720 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5712615f-2791-42fe-9a50-3dafe99495a0" containerName="registry" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.516859 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5712615f-2791-42fe-9a50-3dafe99495a0" containerName="registry" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.517338 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.519884 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lc9cx" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.519943 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.519962 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.526010 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ggffg"] Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.527013 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-ggffg" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.528787 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pxprd" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.531605 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mcllz"] Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.536120 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ggffg"] Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.550633 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x8jj9"] Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.551289 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.552867 4699 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vg6s9" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.558808 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x8jj9"] Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.616563 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlhs\" (UniqueName: \"kubernetes.io/projected/2620c9cc-4041-49f0-bd0a-2b227e8214d6-kube-api-access-twlhs\") pod \"cert-manager-5b446d88c5-ggffg\" (UID: \"2620c9cc-4041-49f0-bd0a-2b227e8214d6\") " pod="cert-manager/cert-manager-5b446d88c5-ggffg" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.616617 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knhlz\" (UniqueName: \"kubernetes.io/projected/69760a96-2f1a-4eca-8bc1-9734e255c260-kube-api-access-knhlz\") pod \"cert-manager-cainjector-7f985d654d-mcllz\" (UID: \"69760a96-2f1a-4eca-8bc1-9734e255c260\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.717859 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlhs\" (UniqueName: \"kubernetes.io/projected/2620c9cc-4041-49f0-bd0a-2b227e8214d6-kube-api-access-twlhs\") pod \"cert-manager-5b446d88c5-ggffg\" (UID: \"2620c9cc-4041-49f0-bd0a-2b227e8214d6\") " pod="cert-manager/cert-manager-5b446d88c5-ggffg" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.717923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knhlz\" (UniqueName: \"kubernetes.io/projected/69760a96-2f1a-4eca-8bc1-9734e255c260-kube-api-access-knhlz\") 
pod \"cert-manager-cainjector-7f985d654d-mcllz\" (UID: \"69760a96-2f1a-4eca-8bc1-9734e255c260\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.717988 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97tz\" (UniqueName: \"kubernetes.io/projected/bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b-kube-api-access-t97tz\") pod \"cert-manager-webhook-5655c58dd6-x8jj9\" (UID: \"bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.747283 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knhlz\" (UniqueName: \"kubernetes.io/projected/69760a96-2f1a-4eca-8bc1-9734e255c260-kube-api-access-knhlz\") pod \"cert-manager-cainjector-7f985d654d-mcllz\" (UID: \"69760a96-2f1a-4eca-8bc1-9734e255c260\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.749596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlhs\" (UniqueName: \"kubernetes.io/projected/2620c9cc-4041-49f0-bd0a-2b227e8214d6-kube-api-access-twlhs\") pod \"cert-manager-5b446d88c5-ggffg\" (UID: \"2620c9cc-4041-49f0-bd0a-2b227e8214d6\") " pod="cert-manager/cert-manager-5b446d88c5-ggffg" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.819291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97tz\" (UniqueName: \"kubernetes.io/projected/bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b-kube-api-access-t97tz\") pod \"cert-manager-webhook-5655c58dd6-x8jj9\" (UID: \"bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.836072 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t97tz\" (UniqueName: \"kubernetes.io/projected/bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b-kube-api-access-t97tz\") pod \"cert-manager-webhook-5655c58dd6-x8jj9\" (UID: \"bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.840744 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.869519 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-ggffg" Nov 22 04:18:24 crc kubenswrapper[4699]: I1122 04:18:24.874541 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.086222 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-mcllz"] Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.100616 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.164191 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-x8jj9"] Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.166774 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ggffg"] Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.193305 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" event={"ID":"bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b","Type":"ContainerStarted","Data":"ac79ac651bb9b47037b836b2eacc014d7af1762b2126ac77bdaa072d510b8525"} Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.194923 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-ggffg" event={"ID":"2620c9cc-4041-49f0-bd0a-2b227e8214d6","Type":"ContainerStarted","Data":"08bdf220f2aa9ad1e09912e43a36755c609636d0c593b5e21b61ca8c811bca64"} Nov 22 04:18:25 crc kubenswrapper[4699]: I1122 04:18:25.197423 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" event={"ID":"69760a96-2f1a-4eca-8bc1-9734e255c260","Type":"ContainerStarted","Data":"9e00e4d096c800c68041973b0e6f8322cfa0d95ca37b2406d7bacc10d5ca7b3a"} Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.234357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-ggffg" event={"ID":"2620c9cc-4041-49f0-bd0a-2b227e8214d6","Type":"ContainerStarted","Data":"f8dbb1535e674888c37d5a957181bf742e6a5a566b01885acfcffae84490d68b"} Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.237034 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" event={"ID":"69760a96-2f1a-4eca-8bc1-9734e255c260","Type":"ContainerStarted","Data":"dd7e43895e3e35855b87831545c3fa93746768440d9745a63d98a140ba3a7626"} Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.238734 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" event={"ID":"bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b","Type":"ContainerStarted","Data":"32a03bb999d19d3f0687a5ad9c62fffeecab75d60a4cd43b478ad3bdac8d12bf"} Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.238869 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.250566 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-ggffg" podStartSLOduration=1.832637761 podStartE2EDuration="5.250547219s" podCreationTimestamp="2025-11-22 04:18:24 +0000 
UTC" firstStartedPulling="2025-11-22 04:18:25.176645619 +0000 UTC m=+656.519266796" lastFinishedPulling="2025-11-22 04:18:28.594555067 +0000 UTC m=+659.937176254" observedRunningTime="2025-11-22 04:18:29.248008796 +0000 UTC m=+660.590629983" watchObservedRunningTime="2025-11-22 04:18:29.250547219 +0000 UTC m=+660.593168406" Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.266710 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-mcllz" podStartSLOduration=1.767767447 podStartE2EDuration="5.266689365s" podCreationTimestamp="2025-11-22 04:18:24 +0000 UTC" firstStartedPulling="2025-11-22 04:18:25.100240132 +0000 UTC m=+656.442861319" lastFinishedPulling="2025-11-22 04:18:28.59916205 +0000 UTC m=+659.941783237" observedRunningTime="2025-11-22 04:18:29.263460186 +0000 UTC m=+660.606081383" watchObservedRunningTime="2025-11-22 04:18:29.266689365 +0000 UTC m=+660.609310552" Nov 22 04:18:29 crc kubenswrapper[4699]: I1122 04:18:29.279760 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" podStartSLOduration=1.849528366 podStartE2EDuration="5.279743766s" podCreationTimestamp="2025-11-22 04:18:24 +0000 UTC" firstStartedPulling="2025-11-22 04:18:25.174423674 +0000 UTC m=+656.517044861" lastFinishedPulling="2025-11-22 04:18:28.604639074 +0000 UTC m=+659.947260261" observedRunningTime="2025-11-22 04:18:29.278097105 +0000 UTC m=+660.620718292" watchObservedRunningTime="2025-11-22 04:18:29.279743766 +0000 UTC m=+660.622364953" Nov 22 04:18:34 crc kubenswrapper[4699]: I1122 04:18:34.877669 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-x8jj9" Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.766421 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z7552"] Nov 22 04:18:52 crc kubenswrapper[4699]: 
I1122 04:18:52.768040 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-controller" containerID="cri-o://df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768131 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="northd" containerID="cri-o://85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768243 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768335 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-node" containerID="cri-o://823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768388 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-acl-logging" containerID="cri-o://e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768392 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" 
podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="sbdb" containerID="cri-o://c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.768469 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="nbdb" containerID="cri-o://1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" gracePeriod=30 Nov 22 04:18:52 crc kubenswrapper[4699]: I1122 04:18:52.842403 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" containerID="cri-o://b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" gracePeriod=30 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.121705 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/3.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.123853 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovn-acl-logging/0.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.124607 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovn-controller/0.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.125026 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173244 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-grlgt"] Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173572 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-acl-logging" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173590 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-acl-logging" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173602 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173615 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173628 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173635 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173646 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173653 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173665 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-node" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173672 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-node" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173681 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173687 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173694 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="nbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173701 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="nbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173710 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="sbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173716 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="sbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173728 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="northd" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173736 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="northd" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173746 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" 
containerName="kubecfg-setup" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173753 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kubecfg-setup" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173763 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173772 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.173781 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173788 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173894 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173906 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="sbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173915 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173923 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173934 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173943 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173950 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovn-acl-logging" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173958 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173969 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="kube-rbac-proxy-node" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173982 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="nbdb" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.173993 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="northd" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.174131 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.174140 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.174242 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerName="ovnkube-controller" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.176163 4699 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287761 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km2cd\" (UniqueName: \"kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287826 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287852 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287878 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287894 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287910 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287934 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287955 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.287970 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288001 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288267 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd\") pod 
\"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288297 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288314 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288356 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288369 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288394 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288407 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288421 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288455 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288475 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib\") pod \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\" (UID: \"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3\") " Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kf4\" (UniqueName: \"kubernetes.io/projected/f81deb4d-348a-4b1b-9506-beff439c180b-kube-api-access-l2kf4\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288644 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log" (OuterVolumeSpecName: "node-log") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" 
(UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288671 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-env-overrides\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288709 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288815 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-ovn\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288802 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288879 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f81deb4d-348a-4b1b-9506-beff439c180b-ovn-node-metrics-cert\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288916 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-slash\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288964 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-script-lib\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-kubelet\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289047 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289091 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-systemd-units\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289122 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289163 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-config\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289224 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-netns\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289271 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-node-log\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289324 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-netd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289382 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-log-socket\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289483 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-etc-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289553 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-var-lib-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.288882 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289513 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289596 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289651 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). 
InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289618 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289674 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289683 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289695 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289717 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289727 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289748 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289745 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289950 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-systemd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.289980 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290153 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-bin\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290253 4699 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290271 4699 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290283 4699 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290296 4699 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290307 4699 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290319 4699 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290331 4699 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290342 4699 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290354 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290366 4699 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 
crc kubenswrapper[4699]: I1122 04:18:53.290375 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290385 4699 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290396 4699 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290407 4699 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290418 4699 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290448 4699 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.290681 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.293960 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd" (OuterVolumeSpecName: "kube-api-access-km2cd") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "kube-api-access-km2cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.294748 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.301905 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" (UID: "fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.375283 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/2.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.375997 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/1.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.376053 4699 generic.go:334] "Generic (PLEG): container finished" podID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" containerID="ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b" exitCode=2 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.376164 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerDied","Data":"ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.376238 4699 scope.go:117] "RemoveContainer" containerID="b3db78d8652d86af236e2b210210af39f3c90f31425810390e79391e581d0cf9" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.376910 4699 scope.go:117] "RemoveContainer" containerID="ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.377126 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pmtb4_openshift-multus(c5f530d5-6f69-4838-a0dd-f4662ddbf85c)\"" pod="openshift-multus/multus-pmtb4" podUID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.378642 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovnkube-controller/3.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.380993 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovn-acl-logging/0.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.381503 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z7552_fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/ovn-controller/0.log" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.381926 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.381983 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382016 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382024 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.381955 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382075 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382085 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382094 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382104 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" exitCode=0 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382134 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382154 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382170 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382185 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382202 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382219 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382229 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382236 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 
04:18:53.382243 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382250 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382257 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382265 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382272 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382279 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382288 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382303 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382315 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382322 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382330 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382337 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382346 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382353 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382361 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382368 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382375 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382391 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" exitCode=143 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382412 4699 generic.go:334] "Generic (PLEG): container finished" podID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" exitCode=143 Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382459 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382474 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382486 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382493 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382502 4699 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382510 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382517 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382527 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382534 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382541 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382550 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382560 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z7552" event={"ID":"fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3","Type":"ContainerDied","Data":"1bce859ec6c521dfc40466f879cabfb7816b2238d18f4fbba72fbb2cd24fa9ec"} Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382572 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382580 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382588 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382595 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382602 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382609 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382617 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382624 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382633 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.382641 4699 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391399 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-slash\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391470 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-script-lib\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391495 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-kubelet\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391512 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: 
\"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-slash\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391538 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-systemd-units\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391557 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391567 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-kubelet\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391576 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-config\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391594 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-netns\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391617 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391636 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-node-log\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391665 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-netd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc 
kubenswrapper[4699]: I1122 04:18:53.391690 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-log-socket\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391715 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-etc-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391742 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-var-lib-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391757 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-systemd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391778 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-bin\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391797 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2kf4\" (UniqueName: \"kubernetes.io/projected/f81deb4d-348a-4b1b-9506-beff439c180b-kube-api-access-l2kf4\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391813 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-env-overrides\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391835 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-ovn\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f81deb4d-348a-4b1b-9506-beff439c180b-ovn-node-metrics-cert\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391904 4699 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391923 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2cd\" (UniqueName: \"kubernetes.io/projected/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-kube-api-access-km2cd\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.391596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-systemd-units\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392068 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-etc-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392134 4699 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392173 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-ovn\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392318 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-node-log\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-log-socket\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392359 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-var-lib-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-netns\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392360 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-netd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392385 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-run-ovn-kubernetes\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392391 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-config\") pod 
\"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392397 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-systemd\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392216 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-host-cni-bin\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392862 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-ovnkube-script-lib\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392953 4699 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.393472 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f81deb4d-348a-4b1b-9506-beff439c180b-env-overrides\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.392944 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f81deb4d-348a-4b1b-9506-beff439c180b-run-openvswitch\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.396182 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f81deb4d-348a-4b1b-9506-beff439c180b-ovn-node-metrics-cert\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.413306 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kf4\" (UniqueName: \"kubernetes.io/projected/f81deb4d-348a-4b1b-9506-beff439c180b-kube-api-access-l2kf4\") pod \"ovnkube-node-grlgt\" (UID: \"f81deb4d-348a-4b1b-9506-beff439c180b\") " pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.414167 4699 scope.go:117] "RemoveContainer" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.423935 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z7552"] Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.432599 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z7552"] Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.444188 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.455904 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3" path="/var/lib/kubelet/pods/fa3d3ec8-1b76-4cc3-bfc0-60a9c6bc29f3/volumes" Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.466392 4699 scope.go:117] "RemoveContainer" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.480775 4699 scope.go:117] "RemoveContainer" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.490453 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.494645 4699 scope.go:117] "RemoveContainer" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.509307 4699 scope.go:117] "RemoveContainer" containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.525310 4699 scope.go:117] "RemoveContainer" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.545399 4699 scope.go:117] "RemoveContainer" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.562631 4699 scope.go:117] "RemoveContainer" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.581899 4699 scope.go:117] "RemoveContainer" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.598722 4699 scope.go:117] "RemoveContainer" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.599516 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": container with ID starting with b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d not found: ID does not exist" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.599745 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} err="failed to get container status \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": rpc error: code = NotFound desc = could not find container \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": container with ID starting with b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.600015 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.600940 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": container with ID starting with bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0 not found: ID does not exist" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.600994 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} err="failed to get container status \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": rpc error: code = NotFound desc = could not find container \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": container with ID 
starting with bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.601054 4699 scope.go:117] "RemoveContainer" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.601466 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": container with ID starting with c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893 not found: ID does not exist" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.601507 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} err="failed to get container status \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": rpc error: code = NotFound desc = could not find container \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": container with ID starting with c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.601530 4699 scope.go:117] "RemoveContainer" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.601762 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": container with ID starting with 1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a not found: ID does not exist" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 
04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.601846 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} err="failed to get container status \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": rpc error: code = NotFound desc = could not find container \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": container with ID starting with 1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.601919 4699 scope.go:117] "RemoveContainer" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.602380 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": container with ID starting with 85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5 not found: ID does not exist" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.602448 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} err="failed to get container status \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": rpc error: code = NotFound desc = could not find container \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": container with ID starting with 85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.602475 4699 scope.go:117] "RemoveContainer" 
containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.602918 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": container with ID starting with ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e not found: ID does not exist" containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.602968 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} err="failed to get container status \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": rpc error: code = NotFound desc = could not find container \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": container with ID starting with ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.602985 4699 scope.go:117] "RemoveContainer" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.603304 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": container with ID starting with 823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d not found: ID does not exist" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.603385 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} err="failed to get container status \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": rpc error: code = NotFound desc = could not find container \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": container with ID starting with 823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.603481 4699 scope.go:117] "RemoveContainer" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.603901 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": container with ID starting with e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612 not found: ID does not exist" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.603936 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} err="failed to get container status \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": rpc error: code = NotFound desc = could not find container \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": container with ID starting with e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.603954 4699 scope.go:117] "RemoveContainer" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.604339 4699 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": container with ID starting with df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710 not found: ID does not exist" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.604390 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} err="failed to get container status \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": rpc error: code = NotFound desc = could not find container \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": container with ID starting with df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.604460 4699 scope.go:117] "RemoveContainer" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: E1122 04:18:53.604787 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": container with ID starting with a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4 not found: ID does not exist" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.604829 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} err="failed to get container status \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": rpc error: code = NotFound desc = could not find container 
\"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": container with ID starting with a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.604855 4699 scope.go:117] "RemoveContainer" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.605197 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} err="failed to get container status \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": rpc error: code = NotFound desc = could not find container \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": container with ID starting with b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.605227 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.605523 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} err="failed to get container status \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": rpc error: code = NotFound desc = could not find container \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": container with ID starting with bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.605569 4699 scope.go:117] "RemoveContainer" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606026 4699 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} err="failed to get container status \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": rpc error: code = NotFound desc = could not find container \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": container with ID starting with c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606054 4699 scope.go:117] "RemoveContainer" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606504 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} err="failed to get container status \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": rpc error: code = NotFound desc = could not find container \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": container with ID starting with 1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606538 4699 scope.go:117] "RemoveContainer" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606836 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} err="failed to get container status \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": rpc error: code = NotFound desc = could not find container \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": container with ID starting with 
85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.606869 4699 scope.go:117] "RemoveContainer" containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.607302 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} err="failed to get container status \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": rpc error: code = NotFound desc = could not find container \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": container with ID starting with ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.607336 4699 scope.go:117] "RemoveContainer" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.607727 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} err="failed to get container status \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": rpc error: code = NotFound desc = could not find container \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": container with ID starting with 823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.607762 4699 scope.go:117] "RemoveContainer" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.608219 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} err="failed to get container status \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": rpc error: code = NotFound desc = could not find container \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": container with ID starting with e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.608248 4699 scope.go:117] "RemoveContainer" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.608641 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} err="failed to get container status \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": rpc error: code = NotFound desc = could not find container \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": container with ID starting with df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.608675 4699 scope.go:117] "RemoveContainer" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609145 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} err="failed to get container status \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": rpc error: code = NotFound desc = could not find container \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": container with ID starting with a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4 not found: ID does not 
exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609174 4699 scope.go:117] "RemoveContainer" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609453 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} err="failed to get container status \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": rpc error: code = NotFound desc = could not find container \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": container with ID starting with b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609483 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609809 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} err="failed to get container status \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": rpc error: code = NotFound desc = could not find container \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": container with ID starting with bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.609828 4699 scope.go:117] "RemoveContainer" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.610204 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} err="failed to get container status 
\"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": rpc error: code = NotFound desc = could not find container \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": container with ID starting with c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.610237 4699 scope.go:117] "RemoveContainer" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611072 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} err="failed to get container status \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": rpc error: code = NotFound desc = could not find container \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": container with ID starting with 1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611117 4699 scope.go:117] "RemoveContainer" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611467 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} err="failed to get container status \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": rpc error: code = NotFound desc = could not find container \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": container with ID starting with 85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611504 4699 scope.go:117] "RemoveContainer" 
containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611848 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} err="failed to get container status \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": rpc error: code = NotFound desc = could not find container \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": container with ID starting with ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.611882 4699 scope.go:117] "RemoveContainer" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.612247 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} err="failed to get container status \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": rpc error: code = NotFound desc = could not find container \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": container with ID starting with 823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.612280 4699 scope.go:117] "RemoveContainer" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.612651 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} err="failed to get container status \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": rpc error: code = NotFound desc = could 
not find container \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": container with ID starting with e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.612679 4699 scope.go:117] "RemoveContainer" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.613050 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} err="failed to get container status \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": rpc error: code = NotFound desc = could not find container \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": container with ID starting with df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.613163 4699 scope.go:117] "RemoveContainer" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.613617 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} err="failed to get container status \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": rpc error: code = NotFound desc = could not find container \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": container with ID starting with a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.613642 4699 scope.go:117] "RemoveContainer" containerID="b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 
04:18:53.613883 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d"} err="failed to get container status \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": rpc error: code = NotFound desc = could not find container \"b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d\": container with ID starting with b673fbe125877625a87ad0ea5862032a63a472c737bac93e0e9be1c479112f3d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.613959 4699 scope.go:117] "RemoveContainer" containerID="bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.614294 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0"} err="failed to get container status \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": rpc error: code = NotFound desc = could not find container \"bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0\": container with ID starting with bbb779ff19249c1428629a088a765868d3740d2e2ebbac18bdd170537da92af0 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.614366 4699 scope.go:117] "RemoveContainer" containerID="c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.614720 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893"} err="failed to get container status \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": rpc error: code = NotFound desc = could not find container \"c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893\": container with ID starting with 
c6e92bdca528d18bd4178a24439a8687fcc6c32b925903404d59457758729893 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.614749 4699 scope.go:117] "RemoveContainer" containerID="1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.615251 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a"} err="failed to get container status \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": rpc error: code = NotFound desc = could not find container \"1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a\": container with ID starting with 1ae699be665a2c6e8dc69c20eaedb17d2718ea63bbd50c756fa2bbd338ddbc6a not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.615286 4699 scope.go:117] "RemoveContainer" containerID="85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.615735 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5"} err="failed to get container status \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": rpc error: code = NotFound desc = could not find container \"85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5\": container with ID starting with 85147b61eeac671dd2983919a9460dd66429f0846f910d088a04c19250ec15d5 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.615838 4699 scope.go:117] "RemoveContainer" containerID="ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616215 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e"} err="failed to get container status \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": rpc error: code = NotFound desc = could not find container \"ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e\": container with ID starting with ad3bd52fab7837099d8cd6905a6c50694f8375f78721e3e897240255c5b1907e not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616247 4699 scope.go:117] "RemoveContainer" containerID="823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616594 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d"} err="failed to get container status \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": rpc error: code = NotFound desc = could not find container \"823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d\": container with ID starting with 823b5df48f3158d5c815838350f5a2c48100e845ef94325a9580cb875695560d not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616631 4699 scope.go:117] "RemoveContainer" containerID="e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616945 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612"} err="failed to get container status \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": rpc error: code = NotFound desc = could not find container \"e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612\": container with ID starting with e73251a6c23d36a2bac69aec314d503982e5b6ced73c024277dbdad8cacba612 not found: ID does not 
exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.616979 4699 scope.go:117] "RemoveContainer" containerID="df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.617290 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710"} err="failed to get container status \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": rpc error: code = NotFound desc = could not find container \"df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710\": container with ID starting with df58f0d288c1a96557d090a4f26a1a53a7c83de90af7acb2e9b66961e6368710 not found: ID does not exist" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.617332 4699 scope.go:117] "RemoveContainer" containerID="a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4" Nov 22 04:18:53 crc kubenswrapper[4699]: I1122 04:18:53.617686 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4"} err="failed to get container status \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": rpc error: code = NotFound desc = could not find container \"a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4\": container with ID starting with a7e9075e8d0c8c8fc859e51c7861343034e9dff7fb730f43ecfd99db0c101ff4 not found: ID does not exist" Nov 22 04:18:54 crc kubenswrapper[4699]: I1122 04:18:54.391024 4699 generic.go:334] "Generic (PLEG): container finished" podID="f81deb4d-348a-4b1b-9506-beff439c180b" containerID="b2f1e7f3f9e4cbd7a52809a3217ec7ebbaaea322003dc5b871c96c9024d8c18d" exitCode=0 Nov 22 04:18:54 crc kubenswrapper[4699]: I1122 04:18:54.391379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" 
event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerDied","Data":"b2f1e7f3f9e4cbd7a52809a3217ec7ebbaaea322003dc5b871c96c9024d8c18d"} Nov 22 04:18:54 crc kubenswrapper[4699]: I1122 04:18:54.392000 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"956de551daa6341ad8125ac844dc5f878aa9d00447c1e7dcd155ac372f2e397f"} Nov 22 04:18:54 crc kubenswrapper[4699]: I1122 04:18:54.394970 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/2.log" Nov 22 04:18:55 crc kubenswrapper[4699]: I1122 04:18:55.412703 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"bbea5b6a242280083af082a6c00e3eebf8a2504cc588bcf33fbcd24e461b5073"} Nov 22 04:18:55 crc kubenswrapper[4699]: I1122 04:18:55.413389 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"92755be227364e35373169a376cca89ff8a07408f2d2b6a4579ef3a3addd6a9f"} Nov 22 04:18:55 crc kubenswrapper[4699]: I1122 04:18:55.413411 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"bea490ddb21373a52036940529219798b7b2a7f2728829c6a32efbdb74469def"} Nov 22 04:18:55 crc kubenswrapper[4699]: I1122 04:18:55.413424 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"74b715862e68d2792313928b0f435b37ddc51069b0d04b0ae02ba115b216d8d4"} Nov 22 04:18:55 crc 
kubenswrapper[4699]: I1122 04:18:55.413465 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"1c77786dfbad98456efd648cbae7167d86f0aa170ca122946aabddb0e9f00013"} Nov 22 04:18:55 crc kubenswrapper[4699]: I1122 04:18:55.413478 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"66b22431b3dd05e1698510ca42b7094e93ceea6ca38b22f5878187f08638d4e7"} Nov 22 04:18:57 crc kubenswrapper[4699]: I1122 04:18:57.428050 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"cc5066fa03fd99113dced4e47ca37841e051c57b9615ac11a422e96467c0bb53"} Nov 22 04:19:00 crc kubenswrapper[4699]: I1122 04:19:00.448351 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" event={"ID":"f81deb4d-348a-4b1b-9506-beff439c180b","Type":"ContainerStarted","Data":"8d81f4928d29bf0820d6a829fa5a63bbb1b002be76ab53b1f3503d2577bd73ae"} Nov 22 04:19:00 crc kubenswrapper[4699]: I1122 04:19:00.449359 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:00 crc kubenswrapper[4699]: I1122 04:19:00.478852 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" podStartSLOduration=7.47882532 podStartE2EDuration="7.47882532s" podCreationTimestamp="2025-11-22 04:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:00.475789485 +0000 UTC m=+691.818410692" watchObservedRunningTime="2025-11-22 04:19:00.47882532 +0000 UTC 
m=+691.821446507" Nov 22 04:19:00 crc kubenswrapper[4699]: I1122 04:19:00.492054 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:01 crc kubenswrapper[4699]: I1122 04:19:01.456415 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:01 crc kubenswrapper[4699]: I1122 04:19:01.456470 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:01 crc kubenswrapper[4699]: I1122 04:19:01.483913 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.242668 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr"] Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.244073 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.248527 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.259496 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr"] Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.353817 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwsb\" (UniqueName: \"kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.353883 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.353932 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: 
I1122 04:19:05.454977 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.455090 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwsb\" (UniqueName: \"kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.455123 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.455668 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.455766 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.477548 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwsb\" (UniqueName: \"kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: I1122 04:19:05.563724 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: E1122 04:19:05.603168 4699 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(8e089f40df2cacb96eb8b8153fe064b40e06fbef3930500af0ccf408e7e338a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:19:05 crc kubenswrapper[4699]: E1122 04:19:05.603248 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(8e089f40df2cacb96eb8b8153fe064b40e06fbef3930500af0ccf408e7e338a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: E1122 04:19:05.603273 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(8e089f40df2cacb96eb8b8153fe064b40e06fbef3930500af0ccf408e7e338a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:05 crc kubenswrapper[4699]: E1122 04:19:05.603328 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(8e089f40df2cacb96eb8b8153fe064b40e06fbef3930500af0ccf408e7e338a9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" Nov 22 04:19:06 crc kubenswrapper[4699]: I1122 04:19:06.477514 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:06 crc kubenswrapper[4699]: I1122 04:19:06.477902 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:06 crc kubenswrapper[4699]: E1122 04:19:06.497757 4699 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(1e8cbc74e7d9db109382978605090881c0cc3e255fddf53ef6d6938b812705f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:19:06 crc kubenswrapper[4699]: E1122 04:19:06.497814 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(1e8cbc74e7d9db109382978605090881c0cc3e255fddf53ef6d6938b812705f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:06 crc kubenswrapper[4699]: E1122 04:19:06.497835 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(1e8cbc74e7d9db109382978605090881c0cc3e255fddf53ef6d6938b812705f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:06 crc kubenswrapper[4699]: E1122 04:19:06.497883 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(1e8cbc74e7d9db109382978605090881c0cc3e255fddf53ef6d6938b812705f1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" Nov 22 04:19:08 crc kubenswrapper[4699]: I1122 04:19:08.447162 4699 scope.go:117] "RemoveContainer" containerID="ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b" Nov 22 04:19:08 crc kubenswrapper[4699]: E1122 04:19:08.447539 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pmtb4_openshift-multus(c5f530d5-6f69-4838-a0dd-f4662ddbf85c)\"" pod="openshift-multus/multus-pmtb4" podUID="c5f530d5-6f69-4838-a0dd-f4662ddbf85c" Nov 22 04:19:17 crc kubenswrapper[4699]: I1122 04:19:17.448241 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:17 crc kubenswrapper[4699]: I1122 04:19:17.449210 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:17 crc kubenswrapper[4699]: E1122 04:19:17.486494 4699 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(5ed9d1f14792d492d2c4f3c573ad1f96d1b0d25a45d83c73f44f63a52cb9f039): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 04:19:17 crc kubenswrapper[4699]: E1122 04:19:17.486912 4699 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(5ed9d1f14792d492d2c4f3c573ad1f96d1b0d25a45d83c73f44f63a52cb9f039): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:17 crc kubenswrapper[4699]: E1122 04:19:17.486949 4699 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(5ed9d1f14792d492d2c4f3c573ad1f96d1b0d25a45d83c73f44f63a52cb9f039): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:17 crc kubenswrapper[4699]: E1122 04:19:17.487015 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace(724005e9-061b-46d4-84ce-611d0ddaa0e5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_openshift-marketplace_724005e9-061b-46d4-84ce-611d0ddaa0e5_0(5ed9d1f14792d492d2c4f3c573ad1f96d1b0d25a45d83c73f44f63a52cb9f039): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" Nov 22 04:19:19 crc kubenswrapper[4699]: I1122 04:19:19.450427 4699 scope.go:117] "RemoveContainer" containerID="ffb362e6b86a26120532d834f084b64ff7f8e82585292b537741c72e7d426e3b" Nov 22 04:19:20 crc kubenswrapper[4699]: I1122 04:19:20.557263 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmtb4_c5f530d5-6f69-4838-a0dd-f4662ddbf85c/kube-multus/2.log" Nov 22 04:19:20 crc kubenswrapper[4699]: I1122 04:19:20.557502 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmtb4" event={"ID":"c5f530d5-6f69-4838-a0dd-f4662ddbf85c","Type":"ContainerStarted","Data":"b9275eb52cb8c76179c34bdb9700ef8550e77727c7e8f0818e3f5cc2b1d822ad"} Nov 22 04:19:23 crc kubenswrapper[4699]: I1122 04:19:23.513652 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-grlgt" Nov 22 04:19:32 crc kubenswrapper[4699]: 
I1122 04:19:32.446850 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:32 crc kubenswrapper[4699]: I1122 04:19:32.447745 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:32 crc kubenswrapper[4699]: I1122 04:19:32.643295 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr"] Nov 22 04:19:33 crc kubenswrapper[4699]: I1122 04:19:33.646282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerStarted","Data":"c9d86abfe7a52dc06e30f148ac7f4ed5902744e27d2ec065867798f1f6f5fede"} Nov 22 04:19:33 crc kubenswrapper[4699]: I1122 04:19:33.646632 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerStarted","Data":"bed92a300e670c082992becdc21528efed69893c739fec706c358c6ea9dbe0f1"} Nov 22 04:19:34 crc kubenswrapper[4699]: I1122 04:19:34.654374 4699 generic.go:334] "Generic (PLEG): container finished" podID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerID="c9d86abfe7a52dc06e30f148ac7f4ed5902744e27d2ec065867798f1f6f5fede" exitCode=0 Nov 22 04:19:34 crc kubenswrapper[4699]: I1122 04:19:34.654447 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerDied","Data":"c9d86abfe7a52dc06e30f148ac7f4ed5902744e27d2ec065867798f1f6f5fede"} Nov 22 04:19:38 crc 
kubenswrapper[4699]: I1122 04:19:38.684073 4699 generic.go:334] "Generic (PLEG): container finished" podID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerID="357dbe0c7f2df181e399fd1f8c5082ebe165ad093e3bb2eeddfdd2229d81e8ae" exitCode=0 Nov 22 04:19:38 crc kubenswrapper[4699]: I1122 04:19:38.684175 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerDied","Data":"357dbe0c7f2df181e399fd1f8c5082ebe165ad093e3bb2eeddfdd2229d81e8ae"} Nov 22 04:19:39 crc kubenswrapper[4699]: I1122 04:19:39.691496 4699 generic.go:334] "Generic (PLEG): container finished" podID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerID="3f052ebf2cc397f74552b51f467bbd4ba14654fb1891d0c1a313c3a2f26e516e" exitCode=0 Nov 22 04:19:39 crc kubenswrapper[4699]: I1122 04:19:39.691597 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerDied","Data":"3f052ebf2cc397f74552b51f467bbd4ba14654fb1891d0c1a313c3a2f26e516e"} Nov 22 04:19:40 crc kubenswrapper[4699]: I1122 04:19:40.880084 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.039086 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util\") pod \"724005e9-061b-46d4-84ce-611d0ddaa0e5\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.039181 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jwsb\" (UniqueName: \"kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb\") pod \"724005e9-061b-46d4-84ce-611d0ddaa0e5\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.039207 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle\") pod \"724005e9-061b-46d4-84ce-611d0ddaa0e5\" (UID: \"724005e9-061b-46d4-84ce-611d0ddaa0e5\") " Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.040180 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle" (OuterVolumeSpecName: "bundle") pod "724005e9-061b-46d4-84ce-611d0ddaa0e5" (UID: "724005e9-061b-46d4-84ce-611d0ddaa0e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.044631 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb" (OuterVolumeSpecName: "kube-api-access-5jwsb") pod "724005e9-061b-46d4-84ce-611d0ddaa0e5" (UID: "724005e9-061b-46d4-84ce-611d0ddaa0e5"). InnerVolumeSpecName "kube-api-access-5jwsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.052154 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util" (OuterVolumeSpecName: "util") pod "724005e9-061b-46d4-84ce-611d0ddaa0e5" (UID: "724005e9-061b-46d4-84ce-611d0ddaa0e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.140356 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.140734 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jwsb\" (UniqueName: \"kubernetes.io/projected/724005e9-061b-46d4-84ce-611d0ddaa0e5-kube-api-access-5jwsb\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.140749 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/724005e9-061b-46d4-84ce-611d0ddaa0e5-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.707582 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" event={"ID":"724005e9-061b-46d4-84ce-611d0ddaa0e5","Type":"ContainerDied","Data":"bed92a300e670c082992becdc21528efed69893c739fec706c358c6ea9dbe0f1"} Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.707638 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr" Nov 22 04:19:41 crc kubenswrapper[4699]: I1122 04:19:41.707648 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bed92a300e670c082992becdc21528efed69893c739fec706c358c6ea9dbe0f1" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.012228 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6nj2r"] Nov 22 04:19:47 crc kubenswrapper[4699]: E1122 04:19:47.012722 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="pull" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.012737 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="pull" Nov 22 04:19:47 crc kubenswrapper[4699]: E1122 04:19:47.012747 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="extract" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.012755 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="extract" Nov 22 04:19:47 crc kubenswrapper[4699]: E1122 04:19:47.012768 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="util" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.012779 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="util" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.012914 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="724005e9-061b-46d4-84ce-611d0ddaa0e5" containerName="extract" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.013339 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.015604 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.015653 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-fmqh2" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.016501 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.023072 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6nj2r"] Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.112856 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76wrn\" (UniqueName: \"kubernetes.io/projected/a7e3ac11-e456-48a4-ad00-114a41462661-kube-api-access-76wrn\") pod \"nmstate-operator-557fdffb88-6nj2r\" (UID: \"a7e3ac11-e456-48a4-ad00-114a41462661\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.214222 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76wrn\" (UniqueName: \"kubernetes.io/projected/a7e3ac11-e456-48a4-ad00-114a41462661-kube-api-access-76wrn\") pod \"nmstate-operator-557fdffb88-6nj2r\" (UID: \"a7e3ac11-e456-48a4-ad00-114a41462661\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.236414 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76wrn\" (UniqueName: \"kubernetes.io/projected/a7e3ac11-e456-48a4-ad00-114a41462661-kube-api-access-76wrn\") pod \"nmstate-operator-557fdffb88-6nj2r\" (UID: 
\"a7e3ac11-e456-48a4-ad00-114a41462661\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.328185 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.549581 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-6nj2r"] Nov 22 04:19:47 crc kubenswrapper[4699]: I1122 04:19:47.751608 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" event={"ID":"a7e3ac11-e456-48a4-ad00-114a41462661","Type":"ContainerStarted","Data":"fa6e06cf510acfd54ce4149d39c1060b970d0612644a3118167f0a0aa643b081"} Nov 22 04:19:50 crc kubenswrapper[4699]: I1122 04:19:50.769964 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" event={"ID":"a7e3ac11-e456-48a4-ad00-114a41462661","Type":"ContainerStarted","Data":"ca8d5ef963da57b15687a4e0693f342b3c6a1095c72b7bbdad80c633beeaf7d8"} Nov 22 04:19:50 crc kubenswrapper[4699]: I1122 04:19:50.793710 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-6nj2r" podStartSLOduration=2.286224394 podStartE2EDuration="4.793685618s" podCreationTimestamp="2025-11-22 04:19:46 +0000 UTC" firstStartedPulling="2025-11-22 04:19:47.55996248 +0000 UTC m=+738.902583667" lastFinishedPulling="2025-11-22 04:19:50.067423704 +0000 UTC m=+741.410044891" observedRunningTime="2025-11-22 04:19:50.789536596 +0000 UTC m=+742.132157823" watchObservedRunningTime="2025-11-22 04:19:50.793685618 +0000 UTC m=+742.136306825" Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.921730 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk"] Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 
04:19:55.923566 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.925587 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr"] Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.926277 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9mv6n" Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.926415 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.927762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.976927 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr"] Nov 22 04:19:55 crc kubenswrapper[4699]: I1122 04:19:55.983066 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.001560 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-429gd"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.002389 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017297 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-nmstate-lock\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017394 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8cr\" (UniqueName: \"kubernetes.io/projected/0e1f9c73-89fc-4ab2-aca3-004315167c79-kube-api-access-jm8cr\") pod \"nmstate-metrics-5dcf9c57c5-r67mk\" (UID: \"0e1f9c73-89fc-4ab2-aca3-004315167c79\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017419 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cmsx\" (UniqueName: \"kubernetes.io/projected/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-kube-api-access-6cmsx\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017474 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-ovs-socket\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017501 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-dbus-socket\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.017524 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwzc\" (UniqueName: \"kubernetes.io/projected/558d67eb-e4e4-46ca-bc65-8e4d568f4037-kube-api-access-8nwzc\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.082399 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.083295 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.085238 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.085460 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.085588 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wcrsd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.093967 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119137 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-nmstate-lock\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119212 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwgp\" (UniqueName: \"kubernetes.io/projected/55b4d3a4-d0be-4184-b9ee-efedf1c27608-kube-api-access-9wwgp\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119243 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/55b4d3a4-d0be-4184-b9ee-efedf1c27608-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " 
pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119273 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm8cr\" (UniqueName: \"kubernetes.io/projected/0e1f9c73-89fc-4ab2-aca3-004315167c79-kube-api-access-jm8cr\") pod \"nmstate-metrics-5dcf9c57c5-r67mk\" (UID: \"0e1f9c73-89fc-4ab2-aca3-004315167c79\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119301 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cmsx\" (UniqueName: \"kubernetes.io/projected/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-kube-api-access-6cmsx\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119327 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-nmstate-lock\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119349 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119450 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-ovs-socket\") pod \"nmstate-handler-429gd\" 
(UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119493 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-dbus-socket\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119518 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwzc\" (UniqueName: \"kubernetes.io/projected/558d67eb-e4e4-46ca-bc65-8e4d568f4037-kube-api-access-8nwzc\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119536 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-ovs-socket\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119668 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: E1122 04:19:56.119819 4699 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Nov 22 04:19:56 crc kubenswrapper[4699]: E1122 04:19:56.119862 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair podName:0d107f7a-e965-41e9-8ceb-5c5ac1c3b530 nodeName:}" failed. No retries permitted until 2025-11-22 04:19:56.619847324 +0000 UTC m=+747.962468511 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair") pod "nmstate-webhook-6b89b748d8-7qwpr" (UID: "0d107f7a-e965-41e9-8ceb-5c5ac1c3b530") : secret "openshift-nmstate-webhook" not found Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.119928 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/558d67eb-e4e4-46ca-bc65-8e4d568f4037-dbus-socket\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.143265 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cmsx\" (UniqueName: \"kubernetes.io/projected/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-kube-api-access-6cmsx\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.143374 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwzc\" (UniqueName: \"kubernetes.io/projected/558d67eb-e4e4-46ca-bc65-8e4d568f4037-kube-api-access-8nwzc\") pod \"nmstate-handler-429gd\" (UID: \"558d67eb-e4e4-46ca-bc65-8e4d568f4037\") " pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.160836 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm8cr\" (UniqueName: \"kubernetes.io/projected/0e1f9c73-89fc-4ab2-aca3-004315167c79-kube-api-access-jm8cr\") pod 
\"nmstate-metrics-5dcf9c57c5-r67mk\" (UID: \"0e1f9c73-89fc-4ab2-aca3-004315167c79\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.220486 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wwgp\" (UniqueName: \"kubernetes.io/projected/55b4d3a4-d0be-4184-b9ee-efedf1c27608-kube-api-access-9wwgp\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.220544 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/55b4d3a4-d0be-4184-b9ee-efedf1c27608-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.220582 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: E1122 04:19:56.220771 4699 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Nov 22 04:19:56 crc kubenswrapper[4699]: E1122 04:19:56.220820 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert podName:55b4d3a4-d0be-4184-b9ee-efedf1c27608 nodeName:}" failed. No retries permitted until 2025-11-22 04:19:56.720806454 +0000 UTC m=+748.063427641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert") pod "nmstate-console-plugin-5874bd7bc5-r7xcp" (UID: "55b4d3a4-d0be-4184-b9ee-efedf1c27608") : secret "plugin-serving-cert" not found Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.222144 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/55b4d3a4-d0be-4184-b9ee-efedf1c27608-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.258732 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wwgp\" (UniqueName: \"kubernetes.io/projected/55b4d3a4-d0be-4184-b9ee-efedf1c27608-kube-api-access-9wwgp\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.278897 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.307220 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-58f94ddcc-l6rch"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.308050 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.327746 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.334848 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f94ddcc-l6rch"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424218 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-config\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424287 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5775\" (UniqueName: \"kubernetes.io/projected/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-kube-api-access-l5775\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-trusted-ca-bundle\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424377 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-oauth-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424398 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-service-ca\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.424504 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-oauth-config\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.515630 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk"] Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525208 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-oauth-config\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525287 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-config\") pod \"console-58f94ddcc-l6rch\" 
(UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525323 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5775\" (UniqueName: \"kubernetes.io/projected/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-kube-api-access-l5775\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525375 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-trusted-ca-bundle\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525399 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-oauth-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525425 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.525468 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-service-ca\") pod \"console-58f94ddcc-l6rch\" (UID: 
\"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.527380 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-oauth-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.527394 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-service-ca\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.527505 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-trusted-ca-bundle\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.528077 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-config\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.530316 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-serving-cert\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " 
pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.530872 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-console-oauth-config\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.539932 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5775\" (UniqueName: \"kubernetes.io/projected/56bfd8c7-3a9e-4e62-842e-f26f40d978ca-kube-api-access-l5775\") pod \"console-58f94ddcc-l6rch\" (UID: \"56bfd8c7-3a9e-4e62-842e-f26f40d978ca\") " pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.626167 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.629504 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0d107f7a-e965-41e9-8ceb-5c5ac1c3b530-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-7qwpr\" (UID: \"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.661778 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.727006 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.731112 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4d3a4-d0be-4184-b9ee-efedf1c27608-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-r7xcp\" (UID: \"55b4d3a4-d0be-4184-b9ee-efedf1c27608\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.802095 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-429gd" event={"ID":"558d67eb-e4e4-46ca-bc65-8e4d568f4037","Type":"ContainerStarted","Data":"8d57f3c55ac99b916701822b13da4f50dc1493ce11e45a66348dc418b6e20eb7"} Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.803125 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" event={"ID":"0e1f9c73-89fc-4ab2-aca3-004315167c79","Type":"ContainerStarted","Data":"8ba3d552779c5333b2903f3967e19b390b5dafe07de45f4722844c4aa48686ae"} Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.854851 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f94ddcc-l6rch"] Nov 22 04:19:56 crc kubenswrapper[4699]: W1122 04:19:56.861065 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56bfd8c7_3a9e_4e62_842e_f26f40d978ca.slice/crio-53859d662e97c683cbf8e484d8f79afd8df01c33e0901ba64ff3727857e653ef WatchSource:0}: Error finding container 53859d662e97c683cbf8e484d8f79afd8df01c33e0901ba64ff3727857e653ef: Status 404 returned error can't find the container with id 53859d662e97c683cbf8e484d8f79afd8df01c33e0901ba64ff3727857e653ef Nov 22 04:19:56 crc kubenswrapper[4699]: I1122 04:19:56.897032 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.009196 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.142333 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr"] Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.242233 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp"] Nov 22 04:19:57 crc kubenswrapper[4699]: W1122 04:19:57.248157 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b4d3a4_d0be_4184_b9ee_efedf1c27608.slice/crio-ef88b4401c9cd725173640260c991133789d6f8e65783bfdf6857a1ad9692400 WatchSource:0}: Error finding container ef88b4401c9cd725173640260c991133789d6f8e65783bfdf6857a1ad9692400: Status 404 returned error can't find the container with id ef88b4401c9cd725173640260c991133789d6f8e65783bfdf6857a1ad9692400 Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.810362 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" 
event={"ID":"55b4d3a4-d0be-4184-b9ee-efedf1c27608","Type":"ContainerStarted","Data":"ef88b4401c9cd725173640260c991133789d6f8e65783bfdf6857a1ad9692400"} Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.812168 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" event={"ID":"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530","Type":"ContainerStarted","Data":"16938109c765bce4ea10ee02fadda7466db6eac064c4f07630abb26b76aa9c32"} Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.813950 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f94ddcc-l6rch" event={"ID":"56bfd8c7-3a9e-4e62-842e-f26f40d978ca","Type":"ContainerStarted","Data":"66e3649b07bb5ba2d9a560a2bce111ad1b7f250ceee6f7ff32f6e89254160483"} Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.813977 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f94ddcc-l6rch" event={"ID":"56bfd8c7-3a9e-4e62-842e-f26f40d978ca","Type":"ContainerStarted","Data":"53859d662e97c683cbf8e484d8f79afd8df01c33e0901ba64ff3727857e653ef"} Nov 22 04:19:57 crc kubenswrapper[4699]: I1122 04:19:57.837131 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58f94ddcc-l6rch" podStartSLOduration=1.8371132449999998 podStartE2EDuration="1.837113245s" podCreationTimestamp="2025-11-22 04:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:19:57.833100926 +0000 UTC m=+749.175722133" watchObservedRunningTime="2025-11-22 04:19:57.837113245 +0000 UTC m=+749.179734432" Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.834566 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" 
event={"ID":"0d107f7a-e965-41e9-8ceb-5c5ac1c3b530","Type":"ContainerStarted","Data":"7f2668d2a13fc03148981eb687a742184c130dfa577b3b0069a556e556b61b53"} Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.835152 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.837671 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" event={"ID":"0e1f9c73-89fc-4ab2-aca3-004315167c79","Type":"ContainerStarted","Data":"fc3c1a98af9f73e439d592c7f0ece72b5c629efd69ad0d8adac64e4c7e2a0cf0"} Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.839914 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-429gd" event={"ID":"558d67eb-e4e4-46ca-bc65-8e4d568f4037","Type":"ContainerStarted","Data":"7d3fa99cc6a6303f855f4c9fdde2b8abc465afcf338036c8e9b5ed1d542a0b8f"} Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.840035 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.857727 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" podStartSLOduration=3.00581248 podStartE2EDuration="6.857706125s" podCreationTimestamp="2025-11-22 04:19:55 +0000 UTC" firstStartedPulling="2025-11-22 04:19:57.153814737 +0000 UTC m=+748.496435924" lastFinishedPulling="2025-11-22 04:20:01.005708382 +0000 UTC m=+752.348329569" observedRunningTime="2025-11-22 04:20:01.855377678 +0000 UTC m=+753.197998885" watchObservedRunningTime="2025-11-22 04:20:01.857706125 +0000 UTC m=+753.200327312" Nov 22 04:20:01 crc kubenswrapper[4699]: I1122 04:20:01.872967 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-429gd" 
podStartSLOduration=2.242540257 podStartE2EDuration="6.87294517s" podCreationTimestamp="2025-11-22 04:19:55 +0000 UTC" firstStartedPulling="2025-11-22 04:19:56.362037384 +0000 UTC m=+747.704658561" lastFinishedPulling="2025-11-22 04:20:00.992442277 +0000 UTC m=+752.335063474" observedRunningTime="2025-11-22 04:20:01.870336786 +0000 UTC m=+753.212957983" watchObservedRunningTime="2025-11-22 04:20:01.87294517 +0000 UTC m=+753.215566357" Nov 22 04:20:02 crc kubenswrapper[4699]: I1122 04:20:02.845517 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" event={"ID":"55b4d3a4-d0be-4184-b9ee-efedf1c27608","Type":"ContainerStarted","Data":"3f0af3afa9a25b4c78c1a76e8c33a3fe91737e763c3d866d3275463d1718c341"} Nov 22 04:20:02 crc kubenswrapper[4699]: I1122 04:20:02.865803 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-r7xcp" podStartSLOduration=1.969001894 podStartE2EDuration="6.865781591s" podCreationTimestamp="2025-11-22 04:19:56 +0000 UTC" firstStartedPulling="2025-11-22 04:19:57.251403705 +0000 UTC m=+748.594024892" lastFinishedPulling="2025-11-22 04:20:02.148183402 +0000 UTC m=+753.490804589" observedRunningTime="2025-11-22 04:20:02.858198915 +0000 UTC m=+754.200820122" watchObservedRunningTime="2025-11-22 04:20:02.865781591 +0000 UTC m=+754.208402778" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.141947 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.142706 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerName="controller-manager" containerID="cri-o://d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa" gracePeriod=30 Nov 
22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.286735 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.287332 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" podUID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" containerName="route-controller-manager" containerID="cri-o://a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388" gracePeriod=30 Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.522306 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.558516 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmrz\" (UniqueName: \"kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz\") pod \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.558609 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles\") pod \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.558646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config\") pod \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.558672 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca\") pod \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.558700 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert\") pod \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\" (UID: \"28796a4d-fde0-4f6e-9a06-8f72bfba6473\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.559413 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28796a4d-fde0-4f6e-9a06-8f72bfba6473" (UID: "28796a4d-fde0-4f6e-9a06-8f72bfba6473"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.560049 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config" (OuterVolumeSpecName: "config") pod "28796a4d-fde0-4f6e-9a06-8f72bfba6473" (UID: "28796a4d-fde0-4f6e-9a06-8f72bfba6473"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.560072 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca" (OuterVolumeSpecName: "client-ca") pod "28796a4d-fde0-4f6e-9a06-8f72bfba6473" (UID: "28796a4d-fde0-4f6e-9a06-8f72bfba6473"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.565687 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28796a4d-fde0-4f6e-9a06-8f72bfba6473" (UID: "28796a4d-fde0-4f6e-9a06-8f72bfba6473"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.565812 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz" (OuterVolumeSpecName: "kube-api-access-rgmrz") pod "28796a4d-fde0-4f6e-9a06-8f72bfba6473" (UID: "28796a4d-fde0-4f6e-9a06-8f72bfba6473"). InnerVolumeSpecName "kube-api-access-rgmrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.620534 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660019 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert\") pod \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660074 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8mxk\" (UniqueName: \"kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk\") pod \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660091 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config\") pod \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca\") pod \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\" (UID: \"a2f1bf82-73fb-4dcc-82e1-7d521ad29241\") " Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660391 4699 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660404 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-config\") on 
node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660413 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28796a4d-fde0-4f6e-9a06-8f72bfba6473-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660421 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28796a4d-fde0-4f6e-9a06-8f72bfba6473-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.660448 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmrz\" (UniqueName: \"kubernetes.io/projected/28796a4d-fde0-4f6e-9a06-8f72bfba6473-kube-api-access-rgmrz\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.661080 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca" (OuterVolumeSpecName: "client-ca") pod "a2f1bf82-73fb-4dcc-82e1-7d521ad29241" (UID: "a2f1bf82-73fb-4dcc-82e1-7d521ad29241"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.661154 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config" (OuterVolumeSpecName: "config") pod "a2f1bf82-73fb-4dcc-82e1-7d521ad29241" (UID: "a2f1bf82-73fb-4dcc-82e1-7d521ad29241"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.663836 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a2f1bf82-73fb-4dcc-82e1-7d521ad29241" (UID: "a2f1bf82-73fb-4dcc-82e1-7d521ad29241"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.664863 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk" (OuterVolumeSpecName: "kube-api-access-w8mxk") pod "a2f1bf82-73fb-4dcc-82e1-7d521ad29241" (UID: "a2f1bf82-73fb-4dcc-82e1-7d521ad29241"). InnerVolumeSpecName "kube-api-access-w8mxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.761271 4699 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.761319 4699 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.761333 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8mxk\" (UniqueName: \"kubernetes.io/projected/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-kube-api-access-w8mxk\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.761346 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f1bf82-73fb-4dcc-82e1-7d521ad29241-config\") on node \"crc\" DevicePath 
\"\"" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.857838 4699 generic.go:334] "Generic (PLEG): container finished" podID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerID="d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa" exitCode=0 Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.857898 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.857920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" event={"ID":"28796a4d-fde0-4f6e-9a06-8f72bfba6473","Type":"ContainerDied","Data":"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa"} Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.857950 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9pj89" event={"ID":"28796a4d-fde0-4f6e-9a06-8f72bfba6473","Type":"ContainerDied","Data":"aa4ba7e5c7cbac23694859423f3e1be56924428b111f64096cc385ae4a082a92"} Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.857970 4699 scope.go:117] "RemoveContainer" containerID="d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.861416 4699 generic.go:334] "Generic (PLEG): container finished" podID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" containerID="a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388" exitCode=0 Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.861490 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.861947 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" event={"ID":"a2f1bf82-73fb-4dcc-82e1-7d521ad29241","Type":"ContainerDied","Data":"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388"} Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.862001 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts" event={"ID":"a2f1bf82-73fb-4dcc-82e1-7d521ad29241","Type":"ContainerDied","Data":"364f481d4dac3795b089009abcb0db3aafb910e97c2580bb39aacdba90c9ea8b"} Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.864885 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" event={"ID":"0e1f9c73-89fc-4ab2-aca3-004315167c79","Type":"ContainerStarted","Data":"74da2fa4712f330c19cbb5d4a09c2defc536b0adfdfb1190a6a57f0ef22140fe"} Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.874689 4699 scope.go:117] "RemoveContainer" containerID="d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa" Nov 22 04:20:04 crc kubenswrapper[4699]: E1122 04:20:04.876935 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa\": container with ID starting with d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa not found: ID does not exist" containerID="d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.876987 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa"} err="failed to get container status \"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa\": rpc error: code = NotFound desc = could not find container \"d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa\": container with ID starting with d6076f5eed758ad66c1ec53ec9f795e616bd4faff09a688d0616ab682adf5eaa not found: ID does not exist" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.877011 4699 scope.go:117] "RemoveContainer" containerID="a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.884462 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-r67mk" podStartSLOduration=2.354534669 podStartE2EDuration="9.884422717s" podCreationTimestamp="2025-11-22 04:19:55 +0000 UTC" firstStartedPulling="2025-11-22 04:19:56.523621264 +0000 UTC m=+747.866242451" lastFinishedPulling="2025-11-22 04:20:04.053509312 +0000 UTC m=+755.396130499" observedRunningTime="2025-11-22 04:20:04.879830764 +0000 UTC m=+756.222451981" watchObservedRunningTime="2025-11-22 04:20:04.884422717 +0000 UTC m=+756.227043904" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.896233 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.896787 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9rpts"] Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.906523 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.909832 4699 scope.go:117] "RemoveContainer" 
containerID="a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.913824 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9pj89"] Nov 22 04:20:04 crc kubenswrapper[4699]: E1122 04:20:04.921838 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388\": container with ID starting with a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388 not found: ID does not exist" containerID="a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388" Nov 22 04:20:04 crc kubenswrapper[4699]: I1122 04:20:04.921888 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388"} err="failed to get container status \"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388\": rpc error: code = NotFound desc = could not find container \"a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388\": container with ID starting with a70b9dfd0581f75ac83ce4633fd0ec96f0c6437b20115e9124ea6335ab1f4388 not found: ID does not exist" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.350152 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd"] Nov 22 04:20:05 crc kubenswrapper[4699]: E1122 04:20:05.350447 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" containerName="route-controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.350460 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" containerName="route-controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: E1122 
04:20:05.350473 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerName="controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.350478 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerName="controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.350607 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" containerName="controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.350618 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" containerName="route-controller-manager" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.351001 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355286 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355417 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355494 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355656 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355656 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 04:20:05 crc kubenswrapper[4699]: 
I1122 04:20:05.355689 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67548675f7-ng52g"] Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.355685 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.357958 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.362193 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd"] Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.367781 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.367788 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.367929 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.368080 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.368386 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.368612 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.368813 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-client-ca\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.368946 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-client-ca\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369071 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6885c78c-b6e6-4503-b61d-fd722060d0f3-serving-cert\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-config\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369128 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1d7c96-88a5-4852-809b-9a77f07da51d-serving-cert\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " 
pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369154 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nls4l\" (UniqueName: \"kubernetes.io/projected/6885c78c-b6e6-4503-b61d-fd722060d0f3-kube-api-access-nls4l\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369203 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-config\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369228 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-proxy-ca-bundles\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.369272 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgjl\" (UniqueName: \"kubernetes.io/projected/7a1d7c96-88a5-4852-809b-9a77f07da51d-kube-api-access-vxgjl\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.370001 4699 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67548675f7-ng52g"] Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.375694 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.465780 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28796a4d-fde0-4f6e-9a06-8f72bfba6473" path="/var/lib/kubelet/pods/28796a4d-fde0-4f6e-9a06-8f72bfba6473/volumes" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.466702 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f1bf82-73fb-4dcc-82e1-7d521ad29241" path="/var/lib/kubelet/pods/a2f1bf82-73fb-4dcc-82e1-7d521ad29241/volumes" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470002 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-config\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470043 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-proxy-ca-bundles\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470085 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgjl\" (UniqueName: \"kubernetes.io/projected/7a1d7c96-88a5-4852-809b-9a77f07da51d-kube-api-access-vxgjl\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") 
" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470120 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-client-ca\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470162 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-client-ca\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6885c78c-b6e6-4503-b61d-fd722060d0f3-serving-cert\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470226 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-config\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470248 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1d7c96-88a5-4852-809b-9a77f07da51d-serving-cert\") 
pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.470273 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nls4l\" (UniqueName: \"kubernetes.io/projected/6885c78c-b6e6-4503-b61d-fd722060d0f3-kube-api-access-nls4l\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.471242 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-config\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.471775 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1d7c96-88a5-4852-809b-9a77f07da51d-client-ca\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.473038 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-config\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.473837 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-proxy-ca-bundles\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.475048 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6885c78c-b6e6-4503-b61d-fd722060d0f3-client-ca\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.476123 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1d7c96-88a5-4852-809b-9a77f07da51d-serving-cert\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.476319 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6885c78c-b6e6-4503-b61d-fd722060d0f3-serving-cert\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.486060 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgjl\" (UniqueName: \"kubernetes.io/projected/7a1d7c96-88a5-4852-809b-9a77f07da51d-kube-api-access-vxgjl\") pod \"route-controller-manager-644bf4ddfd-5s5nd\" (UID: \"7a1d7c96-88a5-4852-809b-9a77f07da51d\") " pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc 
kubenswrapper[4699]: I1122 04:20:05.486792 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nls4l\" (UniqueName: \"kubernetes.io/projected/6885c78c-b6e6-4503-b61d-fd722060d0f3-kube-api-access-nls4l\") pod \"controller-manager-67548675f7-ng52g\" (UID: \"6885c78c-b6e6-4503-b61d-fd722060d0f3\") " pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.676136 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.682417 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.922228 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd"] Nov 22 04:20:05 crc kubenswrapper[4699]: I1122 04:20:05.952814 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67548675f7-ng52g"] Nov 22 04:20:05 crc kubenswrapper[4699]: W1122 04:20:05.962003 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6885c78c_b6e6_4503_b61d_fd722060d0f3.slice/crio-656037c54d292d0e7296e099e227819fe1f453d936250c1ed86febac3a6dcd46 WatchSource:0}: Error finding container 656037c54d292d0e7296e099e227819fe1f453d936250c1ed86febac3a6dcd46: Status 404 returned error can't find the container with id 656037c54d292d0e7296e099e227819fe1f453d936250c1ed86febac3a6dcd46 Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.351447 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-429gd" Nov 22 04:20:06 crc kubenswrapper[4699]: 
I1122 04:20:06.662366 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.662444 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.666382 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.892918 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" event={"ID":"7a1d7c96-88a5-4852-809b-9a77f07da51d","Type":"ContainerStarted","Data":"60e0a229e553a39f75cbdc97da8051d9429e0b68a0aef5f8e48ebd7b252947b2"} Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.892969 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" event={"ID":"7a1d7c96-88a5-4852-809b-9a77f07da51d","Type":"ContainerStarted","Data":"56d5066fff24bb517b192b8de39f8da84687bbb3a3ab66990725fb8cabc62ead"} Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.895540 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.897838 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" event={"ID":"6885c78c-b6e6-4503-b61d-fd722060d0f3","Type":"ContainerStarted","Data":"d6e0f9ff29c0c91f92724eb2bfed243b8c3547616cffa36c8b1cc1a83ef839f9"} Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.897868 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" 
event={"ID":"6885c78c-b6e6-4503-b61d-fd722060d0f3","Type":"ContainerStarted","Data":"656037c54d292d0e7296e099e227819fe1f453d936250c1ed86febac3a6dcd46"} Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.898190 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.899233 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.901469 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58f94ddcc-l6rch" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.903112 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.914588 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-644bf4ddfd-5s5nd" podStartSLOduration=2.914572995 podStartE2EDuration="2.914572995s" podCreationTimestamp="2025-11-22 04:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:20:06.910490634 +0000 UTC m=+758.253111831" watchObservedRunningTime="2025-11-22 04:20:06.914572995 +0000 UTC m=+758.257194172" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.931841 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67548675f7-ng52g" podStartSLOduration=2.931822168 podStartE2EDuration="2.931822168s" podCreationTimestamp="2025-11-22 04:20:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-11-22 04:20:06.928102187 +0000 UTC m=+758.270723384" watchObservedRunningTime="2025-11-22 04:20:06.931822168 +0000 UTC m=+758.274443355" Nov 22 04:20:06 crc kubenswrapper[4699]: I1122 04:20:06.978389 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:20:13 crc kubenswrapper[4699]: I1122 04:20:13.225458 4699 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 22 04:20:16 crc kubenswrapper[4699]: I1122 04:20:16.903970 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-7qwpr" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.470960 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g"] Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.472964 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.475173 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.527565 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g"] Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.584803 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.585227 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.585382 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8j6p\" (UniqueName: \"kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: 
I1122 04:20:28.686730 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.687151 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8j6p\" (UniqueName: \"kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.687232 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.687817 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.687831 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.711403 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8j6p\" (UniqueName: \"kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:28 crc kubenswrapper[4699]: I1122 04:20:28.830696 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.288390 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g"] Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.798193 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.799690 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.849991 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.850051 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.850134 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwjn\" (UniqueName: \"kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.854014 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.950992 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.951046 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.951115 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwjn\" (UniqueName: \"kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.951656 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.951834 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:30 crc kubenswrapper[4699]: I1122 04:20:30.971657 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwjn\" (UniqueName: \"kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn\") pod \"redhat-operators-5ksqz\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:31 crc kubenswrapper[4699]: I1122 04:20:31.023784 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerStarted","Data":"bcb2984f6c79bc7ebb0cec1c7b71e056317133c179ada71435c5dc8ad1351c80"} Nov 22 04:20:31 crc kubenswrapper[4699]: I1122 04:20:31.164097 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:31 crc kubenswrapper[4699]: I1122 04:20:31.696837 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:32 crc kubenswrapper[4699]: I1122 04:20:32.017258 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9sbmb" podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" containerID="cri-o://ae10abcbeb546a531e55346e8f09ec2bc5b329a992f1a29d0dfdf6f5cc5ed9d2" gracePeriod=15 Nov 22 04:20:32 crc kubenswrapper[4699]: I1122 04:20:32.030761 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerStarted","Data":"824ee76c3c76d73df20286f0b2aef49db3987d551e34c67b5952f07dc1ebf912"} Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.037977 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9sbmb_c108dbbb-24af-45d9-a01f-cadab889f225/console/0.log" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.038059 4699 generic.go:334] "Generic (PLEG): container finished" podID="c108dbbb-24af-45d9-a01f-cadab889f225" containerID="ae10abcbeb546a531e55346e8f09ec2bc5b329a992f1a29d0dfdf6f5cc5ed9d2" exitCode=2 Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.038143 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9sbmb" 
event={"ID":"c108dbbb-24af-45d9-a01f-cadab889f225","Type":"ContainerDied","Data":"ae10abcbeb546a531e55346e8f09ec2bc5b329a992f1a29d0dfdf6f5cc5ed9d2"} Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.039523 4699 generic.go:334] "Generic (PLEG): container finished" podID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerID="8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58" exitCode=0 Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.039618 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerDied","Data":"8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58"} Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.040695 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerStarted","Data":"b6ef7552cbb79119ea7ce017542dbd345e8f50aa6036cf3647dc12f2d061e9f8"} Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.433894 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9sbmb_c108dbbb-24af-45d9-a01f-cadab889f225/console/0.log" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.433982 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.592765 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593125 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94hm7\" (UniqueName: \"kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593177 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593203 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593277 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593299 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.593378 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca\") pod \"c108dbbb-24af-45d9-a01f-cadab889f225\" (UID: \"c108dbbb-24af-45d9-a01f-cadab889f225\") " Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.594165 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca" (OuterVolumeSpecName: "service-ca") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.594601 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.594975 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.595064 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config" (OuterVolumeSpecName: "console-config") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.599238 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7" (OuterVolumeSpecName: "kube-api-access-94hm7") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "kube-api-access-94hm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.599341 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.600178 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c108dbbb-24af-45d9-a01f-cadab889f225" (UID: "c108dbbb-24af-45d9-a01f-cadab889f225"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694888 4699 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694921 4699 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694932 4699 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694940 4699 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694948 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94hm7\" (UniqueName: \"kubernetes.io/projected/c108dbbb-24af-45d9-a01f-cadab889f225-kube-api-access-94hm7\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694956 4699 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c108dbbb-24af-45d9-a01f-cadab889f225-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:33 crc kubenswrapper[4699]: I1122 04:20:33.694964 4699 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c108dbbb-24af-45d9-a01f-cadab889f225-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:34 crc 
kubenswrapper[4699]: I1122 04:20:34.048060 4699 generic.go:334] "Generic (PLEG): container finished" podID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerID="b6ef7552cbb79119ea7ce017542dbd345e8f50aa6036cf3647dc12f2d061e9f8" exitCode=0 Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.048138 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerDied","Data":"b6ef7552cbb79119ea7ce017542dbd345e8f50aa6036cf3647dc12f2d061e9f8"} Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.050428 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9sbmb_c108dbbb-24af-45d9-a01f-cadab889f225/console/0.log" Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.051116 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9sbmb" Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.051349 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9sbmb" event={"ID":"c108dbbb-24af-45d9-a01f-cadab889f225","Type":"ContainerDied","Data":"cae099890687377c90c6b3e1e0be3842cbcb38f09e86dc592576b566c8c06eba"} Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.051401 4699 scope.go:117] "RemoveContainer" containerID="ae10abcbeb546a531e55346e8f09ec2bc5b329a992f1a29d0dfdf6f5cc5ed9d2" Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.105507 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:20:34 crc kubenswrapper[4699]: I1122 04:20:34.110844 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9sbmb"] Nov 22 04:20:35 crc kubenswrapper[4699]: I1122 04:20:35.462697 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c108dbbb-24af-45d9-a01f-cadab889f225" path="/var/lib/kubelet/pods/c108dbbb-24af-45d9-a01f-cadab889f225/volumes" Nov 22 04:20:37 crc kubenswrapper[4699]: I1122 04:20:37.069183 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerDied","Data":"dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba"} Nov 22 04:20:37 crc kubenswrapper[4699]: I1122 04:20:37.069127 4699 generic.go:334] "Generic (PLEG): container finished" podID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerID="dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba" exitCode=0 Nov 22 04:20:37 crc kubenswrapper[4699]: I1122 04:20:37.072627 4699 generic.go:334] "Generic (PLEG): container finished" podID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerID="208986205d98fad55d19c8f557ac943f6e083610dfbe4e2bacb3c8c658ebb5cd" exitCode=0 Nov 22 04:20:37 crc kubenswrapper[4699]: I1122 04:20:37.072667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerDied","Data":"208986205d98fad55d19c8f557ac943f6e083610dfbe4e2bacb3c8c658ebb5cd"} Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.081874 4699 generic.go:334] "Generic (PLEG): container finished" podID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerID="ace0feee981db2bb8a3049a325b72b7489b7eef0dd47d2c29337eef9466cd6c8" exitCode=0 Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.081930 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerDied","Data":"ace0feee981db2bb8a3049a325b72b7489b7eef0dd47d2c29337eef9466cd6c8"} Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.084801 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerStarted","Data":"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce"} Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.127919 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ksqz" podStartSLOduration=4.458187845 podStartE2EDuration="8.12785369s" podCreationTimestamp="2025-11-22 04:20:30 +0000 UTC" firstStartedPulling="2025-11-22 04:20:34.05204458 +0000 UTC m=+785.394665767" lastFinishedPulling="2025-11-22 04:20:37.721710425 +0000 UTC m=+789.064331612" observedRunningTime="2025-11-22 04:20:38.119208848 +0000 UTC m=+789.461830045" watchObservedRunningTime="2025-11-22 04:20:38.12785369 +0000 UTC m=+789.470474907" Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.725967 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:20:38 crc kubenswrapper[4699]: I1122 04:20:38.726054 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.418089 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.576772 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle\") pod \"46bd6dfa-3553-4128-9412-a6d995e86f82\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.576894 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util\") pod \"46bd6dfa-3553-4128-9412-a6d995e86f82\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.576973 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8j6p\" (UniqueName: \"kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p\") pod \"46bd6dfa-3553-4128-9412-a6d995e86f82\" (UID: \"46bd6dfa-3553-4128-9412-a6d995e86f82\") " Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.578542 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle" (OuterVolumeSpecName: "bundle") pod "46bd6dfa-3553-4128-9412-a6d995e86f82" (UID: "46bd6dfa-3553-4128-9412-a6d995e86f82"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.582888 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p" (OuterVolumeSpecName: "kube-api-access-b8j6p") pod "46bd6dfa-3553-4128-9412-a6d995e86f82" (UID: "46bd6dfa-3553-4128-9412-a6d995e86f82"). InnerVolumeSpecName "kube-api-access-b8j6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.588393 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util" (OuterVolumeSpecName: "util") pod "46bd6dfa-3553-4128-9412-a6d995e86f82" (UID: "46bd6dfa-3553-4128-9412-a6d995e86f82"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.678638 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8j6p\" (UniqueName: \"kubernetes.io/projected/46bd6dfa-3553-4128-9412-a6d995e86f82-kube-api-access-b8j6p\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.678681 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:39 crc kubenswrapper[4699]: I1122 04:20:39.678693 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46bd6dfa-3553-4128-9412-a6d995e86f82-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:40 crc kubenswrapper[4699]: I1122 04:20:40.097458 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" event={"ID":"46bd6dfa-3553-4128-9412-a6d995e86f82","Type":"ContainerDied","Data":"bcb2984f6c79bc7ebb0cec1c7b71e056317133c179ada71435c5dc8ad1351c80"} Nov 22 04:20:40 crc kubenswrapper[4699]: I1122 04:20:40.097830 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcb2984f6c79bc7ebb0cec1c7b71e056317133c179ada71435c5dc8ad1351c80" Nov 22 04:20:40 crc kubenswrapper[4699]: I1122 04:20:40.097542 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g" Nov 22 04:20:41 crc kubenswrapper[4699]: I1122 04:20:41.164782 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:41 crc kubenswrapper[4699]: I1122 04:20:41.164841 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:42 crc kubenswrapper[4699]: I1122 04:20:42.205105 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ksqz" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="registry-server" probeResult="failure" output=< Nov 22 04:20:42 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Nov 22 04:20:42 crc kubenswrapper[4699]: > Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.219589 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.271208 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.520539 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9"] Nov 22 04:20:51 crc kubenswrapper[4699]: E1122 04:20:51.521086 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="util" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521107 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="util" Nov 22 04:20:51 crc kubenswrapper[4699]: E1122 04:20:51.521126 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521339 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" Nov 22 04:20:51 crc kubenswrapper[4699]: E1122 04:20:51.521353 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="extract" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521361 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="extract" Nov 22 04:20:51 crc kubenswrapper[4699]: E1122 04:20:51.521371 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="pull" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521378 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="pull" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521544 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c108dbbb-24af-45d9-a01f-cadab889f225" containerName="console" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.521568 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bd6dfa-3553-4128-9412-a6d995e86f82" containerName="extract" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.522056 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.526057 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.526373 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zgkpr" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.526519 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.529722 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.532068 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.540487 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9"] Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.630473 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pmvn\" (UniqueName: \"kubernetes.io/projected/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-kube-api-access-7pmvn\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.630535 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-webhook-cert\") pod 
\"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.630685 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-apiservice-cert\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.731944 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pmvn\" (UniqueName: \"kubernetes.io/projected/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-kube-api-access-7pmvn\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.732007 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-webhook-cert\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.732040 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-apiservice-cert\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc 
kubenswrapper[4699]: I1122 04:20:51.743550 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-webhook-cert\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.743550 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-apiservice-cert\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.748938 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pmvn\" (UniqueName: \"kubernetes.io/projected/f195708e-47e9-45a0-8361-7bbe6b6c6c0b-kube-api-access-7pmvn\") pod \"metallb-operator-controller-manager-7c65d8d687-6vpd9\" (UID: \"f195708e-47e9-45a0-8361-7bbe6b6c6c0b\") " pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.767665 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v"] Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.768503 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.770046 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.770287 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-stnxw" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.770473 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.810920 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v"] Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.833410 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-webhook-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.833531 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-apiservice-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.833613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wnkg\" (UniqueName: 
\"kubernetes.io/projected/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-kube-api-access-5wnkg\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.843179 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.934213 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-apiservice-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.934295 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wnkg\" (UniqueName: \"kubernetes.io/projected/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-kube-api-access-5wnkg\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.934331 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-webhook-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.943320 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-apiservice-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.950287 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-webhook-cert\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:51 crc kubenswrapper[4699]: I1122 04:20:51.953774 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wnkg\" (UniqueName: \"kubernetes.io/projected/fbc128ff-e74b-44ee-a1a0-553a38bc79c7-kube-api-access-5wnkg\") pod \"metallb-operator-webhook-server-785654ff4c-blk7v\" (UID: \"fbc128ff-e74b-44ee-a1a0-553a38bc79c7\") " pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:52 crc kubenswrapper[4699]: I1122 04:20:52.112537 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:52 crc kubenswrapper[4699]: I1122 04:20:52.163876 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:52 crc kubenswrapper[4699]: I1122 04:20:52.346084 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9"] Nov 22 04:20:52 crc kubenswrapper[4699]: W1122 04:20:52.352748 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf195708e_47e9_45a0_8361_7bbe6b6c6c0b.slice/crio-d365f78fdd937c1bce1d85ed16392eeb88dc7db2b59125e566c463d39bbf5fe2 WatchSource:0}: Error finding container d365f78fdd937c1bce1d85ed16392eeb88dc7db2b59125e566c463d39bbf5fe2: Status 404 returned error can't find the container with id d365f78fdd937c1bce1d85ed16392eeb88dc7db2b59125e566c463d39bbf5fe2 Nov 22 04:20:52 crc kubenswrapper[4699]: I1122 04:20:52.546718 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v"] Nov 22 04:20:52 crc kubenswrapper[4699]: W1122 04:20:52.552387 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbc128ff_e74b_44ee_a1a0_553a38bc79c7.slice/crio-adad7f81b30b26764a8024437820efbf4aeb2d72e8c40343d34acf875d9a613c WatchSource:0}: Error finding container adad7f81b30b26764a8024437820efbf4aeb2d72e8c40343d34acf875d9a613c: Status 404 returned error can't find the container with id adad7f81b30b26764a8024437820efbf4aeb2d72e8c40343d34acf875d9a613c Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.179795 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" 
event={"ID":"f195708e-47e9-45a0-8361-7bbe6b6c6c0b","Type":"ContainerStarted","Data":"d365f78fdd937c1bce1d85ed16392eeb88dc7db2b59125e566c463d39bbf5fe2"} Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.180764 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" event={"ID":"fbc128ff-e74b-44ee-a1a0-553a38bc79c7","Type":"ContainerStarted","Data":"adad7f81b30b26764a8024437820efbf4aeb2d72e8c40343d34acf875d9a613c"} Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.180946 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ksqz" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="registry-server" containerID="cri-o://6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce" gracePeriod=2 Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.570460 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.656797 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhwjn\" (UniqueName: \"kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn\") pod \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.656906 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content\") pod \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.656934 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities\") pod \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\" (UID: \"0c90bb50-76e5-4b07-ba5c-4307f461e0bd\") " Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.658185 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities" (OuterVolumeSpecName: "utilities") pod "0c90bb50-76e5-4b07-ba5c-4307f461e0bd" (UID: "0c90bb50-76e5-4b07-ba5c-4307f461e0bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.667648 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn" (OuterVolumeSpecName: "kube-api-access-lhwjn") pod "0c90bb50-76e5-4b07-ba5c-4307f461e0bd" (UID: "0c90bb50-76e5-4b07-ba5c-4307f461e0bd"). InnerVolumeSpecName "kube-api-access-lhwjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.758336 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhwjn\" (UniqueName: \"kubernetes.io/projected/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-kube-api-access-lhwjn\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.758386 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.766237 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c90bb50-76e5-4b07-ba5c-4307f461e0bd" (UID: "0c90bb50-76e5-4b07-ba5c-4307f461e0bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:20:53 crc kubenswrapper[4699]: I1122 04:20:53.859238 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c90bb50-76e5-4b07-ba5c-4307f461e0bd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.190902 4699 generic.go:334] "Generic (PLEG): container finished" podID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerID="6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce" exitCode=0 Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.190966 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerDied","Data":"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce"} Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.190999 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ksqz" event={"ID":"0c90bb50-76e5-4b07-ba5c-4307f461e0bd","Type":"ContainerDied","Data":"824ee76c3c76d73df20286f0b2aef49db3987d551e34c67b5952f07dc1ebf912"} Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.191016 4699 scope.go:117] "RemoveContainer" containerID="6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.191033 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ksqz" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.240562 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.241794 4699 scope.go:117] "RemoveContainer" containerID="dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.245822 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ksqz"] Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.275818 4699 scope.go:117] "RemoveContainer" containerID="8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.289867 4699 scope.go:117] "RemoveContainer" containerID="6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce" Nov 22 04:20:54 crc kubenswrapper[4699]: E1122 04:20:54.290230 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce\": container with ID starting with 6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce not found: ID does not exist" containerID="6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.290256 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce"} err="failed to get container status \"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce\": rpc error: code = NotFound desc = could not find container \"6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce\": container with ID starting with 6f18f1a73a84e9963c2ff4b3e12021424abcc3a426f43f7907f67ee4e00956ce not found: ID does 
not exist" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.290278 4699 scope.go:117] "RemoveContainer" containerID="dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba" Nov 22 04:20:54 crc kubenswrapper[4699]: E1122 04:20:54.290598 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba\": container with ID starting with dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba not found: ID does not exist" containerID="dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.290616 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba"} err="failed to get container status \"dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba\": rpc error: code = NotFound desc = could not find container \"dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba\": container with ID starting with dc1aef7d23e0f687b4e04bab8661f82ea96bd00aba5cfd00dd06ab15805624ba not found: ID does not exist" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.290628 4699 scope.go:117] "RemoveContainer" containerID="8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58" Nov 22 04:20:54 crc kubenswrapper[4699]: E1122 04:20:54.292918 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58\": container with ID starting with 8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58 not found: ID does not exist" containerID="8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58" Nov 22 04:20:54 crc kubenswrapper[4699]: I1122 04:20:54.292942 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58"} err="failed to get container status \"8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58\": rpc error: code = NotFound desc = could not find container \"8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58\": container with ID starting with 8dc889cdf591c18d5ef8ae33435955a1ace86b6402c770c0cba3fd9c1ede4e58 not found: ID does not exist" Nov 22 04:20:55 crc kubenswrapper[4699]: I1122 04:20:55.454819 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" path="/var/lib/kubelet/pods/0c90bb50-76e5-4b07-ba5c-4307f461e0bd/volumes" Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.220867 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" event={"ID":"fbc128ff-e74b-44ee-a1a0-553a38bc79c7","Type":"ContainerStarted","Data":"4e111b05d969c822e799166e1dc7ff663f1273347b3e9d803d36d3591b463ed4"} Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.221374 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.222234 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" event={"ID":"f195708e-47e9-45a0-8361-7bbe6b6c6c0b","Type":"ContainerStarted","Data":"c1bbe83ef6a526fc7a06942bf3c7e281ec6fddf73568f4b6ee326ee44215909a"} Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.222548 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.238359 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" podStartSLOduration=1.7818902620000001 podStartE2EDuration="6.238334332s" podCreationTimestamp="2025-11-22 04:20:51 +0000 UTC" firstStartedPulling="2025-11-22 04:20:52.555096033 +0000 UTC m=+803.897717210" lastFinishedPulling="2025-11-22 04:20:57.011540093 +0000 UTC m=+808.354161280" observedRunningTime="2025-11-22 04:20:57.236150169 +0000 UTC m=+808.578771366" watchObservedRunningTime="2025-11-22 04:20:57.238334332 +0000 UTC m=+808.580955519" Nov 22 04:20:57 crc kubenswrapper[4699]: I1122 04:20:57.272204 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9" podStartSLOduration=1.694250984 podStartE2EDuration="6.272187762s" podCreationTimestamp="2025-11-22 04:20:51 +0000 UTC" firstStartedPulling="2025-11-22 04:20:52.356078775 +0000 UTC m=+803.698699962" lastFinishedPulling="2025-11-22 04:20:56.934015553 +0000 UTC m=+808.276636740" observedRunningTime="2025-11-22 04:20:57.268964553 +0000 UTC m=+808.611585780" watchObservedRunningTime="2025-11-22 04:20:57.272187762 +0000 UTC m=+808.614808949" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.690892 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"] Nov 22 04:21:06 crc kubenswrapper[4699]: E1122 04:21:06.691549 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="extract-utilities" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.691561 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="extract-utilities" Nov 22 04:21:06 crc kubenswrapper[4699]: E1122 04:21:06.691579 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="registry-server" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 
04:21:06.691586 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="registry-server" Nov 22 04:21:06 crc kubenswrapper[4699]: E1122 04:21:06.691595 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="extract-content" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.691601 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="extract-content" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.691691 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c90bb50-76e5-4b07-ba5c-4307f461e0bd" containerName="registry-server" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.692396 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.705556 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"] Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.750955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6p8f\" (UniqueName: \"kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.751041 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.751104 
4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.860536 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.861283 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.861827 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.862208 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.862576 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-k6p8f\" (UniqueName: \"kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:06 crc kubenswrapper[4699]: I1122 04:21:06.886006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6p8f\" (UniqueName: \"kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f\") pod \"redhat-marketplace-nnr9j\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:07 crc kubenswrapper[4699]: I1122 04:21:07.063247 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:07 crc kubenswrapper[4699]: I1122 04:21:07.332648 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"] Nov 22 04:21:08 crc kubenswrapper[4699]: I1122 04:21:08.279336 4699 generic.go:334] "Generic (PLEG): container finished" podID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerID="e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84" exitCode=0 Nov 22 04:21:08 crc kubenswrapper[4699]: I1122 04:21:08.279379 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerDied","Data":"e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84"} Nov 22 04:21:08 crc kubenswrapper[4699]: I1122 04:21:08.279452 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerStarted","Data":"e3884affa062ec49a33d94918f512b39c8f3e60fe2829f6fc92dd1bc1bedb513"} Nov 22 04:21:08 crc 
kubenswrapper[4699]: I1122 04:21:08.725682 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:21:08 crc kubenswrapper[4699]: I1122 04:21:08.726015 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:21:09 crc kubenswrapper[4699]: I1122 04:21:09.315755 4699 generic.go:334] "Generic (PLEG): container finished" podID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerID="75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036" exitCode=0 Nov 22 04:21:09 crc kubenswrapper[4699]: I1122 04:21:09.315813 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerDied","Data":"75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036"} Nov 22 04:21:10 crc kubenswrapper[4699]: I1122 04:21:10.322210 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerStarted","Data":"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"} Nov 22 04:21:10 crc kubenswrapper[4699]: I1122 04:21:10.358865 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nnr9j" podStartSLOduration=2.887761468 podStartE2EDuration="4.358849435s" podCreationTimestamp="2025-11-22 04:21:06 +0000 UTC" firstStartedPulling="2025-11-22 04:21:08.281523169 +0000 UTC 
m=+819.624144356" lastFinishedPulling="2025-11-22 04:21:09.752611136 +0000 UTC m=+821.095232323" observedRunningTime="2025-11-22 04:21:10.357274477 +0000 UTC m=+821.699895664" watchObservedRunningTime="2025-11-22 04:21:10.358849435 +0000 UTC m=+821.701470612" Nov 22 04:21:12 crc kubenswrapper[4699]: I1122 04:21:12.117142 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-785654ff4c-blk7v" Nov 22 04:21:17 crc kubenswrapper[4699]: I1122 04:21:17.063832 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:17 crc kubenswrapper[4699]: I1122 04:21:17.065512 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:17 crc kubenswrapper[4699]: I1122 04:21:17.100109 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:17 crc kubenswrapper[4699]: I1122 04:21:17.395378 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:19 crc kubenswrapper[4699]: I1122 04:21:19.478779 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"] Nov 22 04:21:19 crc kubenswrapper[4699]: I1122 04:21:19.479032 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nnr9j" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="registry-server" containerID="cri-o://bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d" gracePeriod=2 Nov 22 04:21:19 crc kubenswrapper[4699]: E1122 04:21:19.524154 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaecc794_ede4_47a1_9fe6_38523c93c942.slice/crio-conmon-bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.130803 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnr9j" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.234745 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content\") pod \"aaecc794-ede4-47a1-9fe6-38523c93c942\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.235080 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6p8f\" (UniqueName: \"kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f\") pod \"aaecc794-ede4-47a1-9fe6-38523c93c942\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.235151 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities\") pod \"aaecc794-ede4-47a1-9fe6-38523c93c942\" (UID: \"aaecc794-ede4-47a1-9fe6-38523c93c942\") " Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.236104 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities" (OuterVolumeSpecName: "utilities") pod "aaecc794-ede4-47a1-9fe6-38523c93c942" (UID: "aaecc794-ede4-47a1-9fe6-38523c93c942"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.243613 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f" (OuterVolumeSpecName: "kube-api-access-k6p8f") pod "aaecc794-ede4-47a1-9fe6-38523c93c942" (UID: "aaecc794-ede4-47a1-9fe6-38523c93c942"). InnerVolumeSpecName "kube-api-access-k6p8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.337026 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6p8f\" (UniqueName: \"kubernetes.io/projected/aaecc794-ede4-47a1-9fe6-38523c93c942-kube-api-access-k6p8f\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.337066 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.376011 4699 generic.go:334] "Generic (PLEG): container finished" podID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerID="bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d" exitCode=0 Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.376246 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nnr9j"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.376257 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerDied","Data":"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"}
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.376302 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nnr9j" event={"ID":"aaecc794-ede4-47a1-9fe6-38523c93c942","Type":"ContainerDied","Data":"e3884affa062ec49a33d94918f512b39c8f3e60fe2829f6fc92dd1bc1bedb513"}
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.376326 4699 scope.go:117] "RemoveContainer" containerID="bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.393755 4699 scope.go:117] "RemoveContainer" containerID="75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.407019 4699 scope.go:117] "RemoveContainer" containerID="e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.423334 4699 scope.go:117] "RemoveContainer" containerID="bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"
Nov 22 04:21:20 crc kubenswrapper[4699]: E1122 04:21:20.424262 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d\": container with ID starting with bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d not found: ID does not exist" containerID="bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.424291 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d"} err="failed to get container status \"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d\": rpc error: code = NotFound desc = could not find container \"bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d\": container with ID starting with bfe3d88b81b413c26b5fe63f721ccb7ffda8831eb0081c630a2ca7b5c9e3c52d not found: ID does not exist"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.424311 4699 scope.go:117] "RemoveContainer" containerID="75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036"
Nov 22 04:21:20 crc kubenswrapper[4699]: E1122 04:21:20.424733 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036\": container with ID starting with 75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036 not found: ID does not exist" containerID="75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.424756 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036"} err="failed to get container status \"75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036\": rpc error: code = NotFound desc = could not find container \"75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036\": container with ID starting with 75d9ce2bb5d44e76acf6f579b8617cfef5aaa70c4d461bac533691998798c036 not found: ID does not exist"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.424771 4699 scope.go:117] "RemoveContainer" containerID="e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84"
Nov 22 04:21:20 crc kubenswrapper[4699]: E1122 04:21:20.425145 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84\": container with ID starting with e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84 not found: ID does not exist" containerID="e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.425164 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84"} err="failed to get container status \"e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84\": rpc error: code = NotFound desc = could not find container \"e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84\": container with ID starting with e7ce568a6e677cb1742f08e6dd70fa9b0bc9a37f25c48b7e1f6b2900a3144f84 not found: ID does not exist"
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.504102 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aaecc794-ede4-47a1-9fe6-38523c93c942" (UID: "aaecc794-ede4-47a1-9fe6-38523c93c942"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.539376 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aaecc794-ede4-47a1-9fe6-38523c93c942-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.703832 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"]
Nov 22 04:21:20 crc kubenswrapper[4699]: I1122 04:21:20.712256 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nnr9j"]
Nov 22 04:21:21 crc kubenswrapper[4699]: I1122 04:21:21.454741 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" path="/var/lib/kubelet/pods/aaecc794-ede4-47a1-9fe6-38523c93c942/volumes"
Nov 22 04:21:31 crc kubenswrapper[4699]: I1122 04:21:31.845937 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c65d8d687-6vpd9"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.555741 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"]
Nov 22 04:21:32 crc kubenswrapper[4699]: E1122 04:21:32.556288 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="extract-content"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.556305 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="extract-content"
Nov 22 04:21:32 crc kubenswrapper[4699]: E1122 04:21:32.556321 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="registry-server"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.556328 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="registry-server"
Nov 22 04:21:32 crc kubenswrapper[4699]: E1122 04:21:32.556340 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="extract-utilities"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.556346 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="extract-utilities"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.556454 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaecc794-ede4-47a1-9fe6-38523c93c942" containerName="registry-server"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.556869 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.559762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.560652 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-7mmzh"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.562658 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-fjgk6"]
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.565209 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.566864 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.567536 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.573903 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"]
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.633714 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-psxdv"]
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.634725 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.639108 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.640184 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.641227 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-h57rc"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.642073 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.649877 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-g47xp"]
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.650882 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.652923 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.668110 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-g47xp"]
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710021 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94g2\" (UniqueName: \"kubernetes.io/projected/2840ab61-4c34-4132-970e-c6d8c615c2bd-kube-api-access-b94g2\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710245 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-reloader\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710285 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-conf\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-sockets\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710479 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vq8\" (UniqueName: \"kubernetes.io/projected/3975d03a-cd82-4ae3-89cb-fcad5f75330c-kube-api-access-l5vq8\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710519 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-startup\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710550 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710602 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3975d03a-cd82-4ae3-89cb-fcad5f75330c-cert\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.710627 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics-certs\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811821 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmvhp\" (UniqueName: \"kubernetes.io/projected/dae7dee7-2390-47bc-83c9-488f48a4cc90-kube-api-access-dmvhp\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811880 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07114818-b4f9-465d-9745-c8a05af60e5a-metallb-excludel2\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811911 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-reloader\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811930 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-conf\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811956 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-sockets\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811976 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vq8\" (UniqueName: \"kubernetes.io/projected/3975d03a-cd82-4ae3-89cb-fcad5f75330c-kube-api-access-l5vq8\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.811992 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-metrics-certs\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812013 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-startup\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812030 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-cert\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812049 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812074 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3975d03a-cd82-4ae3-89cb-fcad5f75330c-cert\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812123 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics-certs\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812147 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-metrics-certs\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812185 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812210 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94g2\" (UniqueName: \"kubernetes.io/projected/2840ab61-4c34-4132-970e-c6d8c615c2bd-kube-api-access-b94g2\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812232 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ghq\" (UniqueName: \"kubernetes.io/projected/07114818-b4f9-465d-9745-c8a05af60e5a-kube-api-access-j9ghq\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812324 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-reloader\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812408 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-sockets\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812503 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-conf\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.812648 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.813139 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2840ab61-4c34-4132-970e-c6d8c615c2bd-frr-startup\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.818525 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3975d03a-cd82-4ae3-89cb-fcad5f75330c-cert\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.818896 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2840ab61-4c34-4132-970e-c6d8c615c2bd-metrics-certs\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.831934 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94g2\" (UniqueName: \"kubernetes.io/projected/2840ab61-4c34-4132-970e-c6d8c615c2bd-kube-api-access-b94g2\") pod \"frr-k8s-fjgk6\" (UID: \"2840ab61-4c34-4132-970e-c6d8c615c2bd\") " pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.848338 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vq8\" (UniqueName: \"kubernetes.io/projected/3975d03a-cd82-4ae3-89cb-fcad5f75330c-kube-api-access-l5vq8\") pod \"frr-k8s-webhook-server-6998585d5-5tpp7\" (UID: \"3975d03a-cd82-4ae3-89cb-fcad5f75330c\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.873416 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.880698 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-fjgk6"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913557 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-metrics-certs\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913629 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-cert\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913672 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-metrics-certs\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913710 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913743 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ghq\" (UniqueName: \"kubernetes.io/projected/07114818-b4f9-465d-9745-c8a05af60e5a-kube-api-access-j9ghq\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913779 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmvhp\" (UniqueName: \"kubernetes.io/projected/dae7dee7-2390-47bc-83c9-488f48a4cc90-kube-api-access-dmvhp\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.913812 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07114818-b4f9-465d-9745-c8a05af60e5a-metallb-excludel2\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: E1122 04:21:32.914299 4699 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 22 04:21:32 crc kubenswrapper[4699]: E1122 04:21:32.914374 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist podName:07114818-b4f9-465d-9745-c8a05af60e5a nodeName:}" failed. No retries permitted until 2025-11-22 04:21:33.414354817 +0000 UTC m=+844.756976084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist") pod "speaker-psxdv" (UID: "07114818-b4f9-465d-9745-c8a05af60e5a") : secret "metallb-memberlist" not found
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.914596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/07114818-b4f9-465d-9745-c8a05af60e5a-metallb-excludel2\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.918624 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-metrics-certs\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.919255 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-metrics-certs\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.919515 4699 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.931601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dae7dee7-2390-47bc-83c9-488f48a4cc90-cert\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.939947 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ghq\" (UniqueName: \"kubernetes.io/projected/07114818-b4f9-465d-9745-c8a05af60e5a-kube-api-access-j9ghq\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.940249 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmvhp\" (UniqueName: \"kubernetes.io/projected/dae7dee7-2390-47bc-83c9-488f48a4cc90-kube-api-access-dmvhp\") pod \"controller-6c7b4b5f48-g47xp\" (UID: \"dae7dee7-2390-47bc-83c9-488f48a4cc90\") " pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:32 crc kubenswrapper[4699]: I1122 04:21:32.966424 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.113201 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7"]
Nov 22 04:21:33 crc kubenswrapper[4699]: W1122 04:21:33.127311 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3975d03a_cd82_4ae3_89cb_fcad5f75330c.slice/crio-718492d0ba45544e254f36ed8ac8061739f102744c6941810530ac794d9b8afc WatchSource:0}: Error finding container 718492d0ba45544e254f36ed8ac8061739f102744c6941810530ac794d9b8afc: Status 404 returned error can't find the container with id 718492d0ba45544e254f36ed8ac8061739f102744c6941810530ac794d9b8afc
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.385639 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-g47xp"]
Nov 22 04:21:33 crc kubenswrapper[4699]: W1122 04:21:33.392033 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddae7dee7_2390_47bc_83c9_488f48a4cc90.slice/crio-4e4bd18fcc5a42c945f80cbac839e38f9526d0ebf3b4e81e7c5e701f46e97d65 WatchSource:0}: Error finding container 4e4bd18fcc5a42c945f80cbac839e38f9526d0ebf3b4e81e7c5e701f46e97d65: Status 404 returned error can't find the container with id 4e4bd18fcc5a42c945f80cbac839e38f9526d0ebf3b4e81e7c5e701f46e97d65
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.420743 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:33 crc kubenswrapper[4699]: E1122 04:21:33.420960 4699 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 22 04:21:33 crc kubenswrapper[4699]: E1122 04:21:33.421189 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist podName:07114818-b4f9-465d-9745-c8a05af60e5a nodeName:}" failed. No retries permitted until 2025-11-22 04:21:34.421155509 +0000 UTC m=+845.763776696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist") pod "speaker-psxdv" (UID: "07114818-b4f9-465d-9745-c8a05af60e5a") : secret "metallb-memberlist" not found
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.454370 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"90381dfcd6e18c7ca34e4d952335f628e6e07186970a7976b5eeb2b887e58c2f"}
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.454411 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-g47xp" event={"ID":"dae7dee7-2390-47bc-83c9-488f48a4cc90","Type":"ContainerStarted","Data":"4e4bd18fcc5a42c945f80cbac839e38f9526d0ebf3b4e81e7c5e701f46e97d65"}
Nov 22 04:21:33 crc kubenswrapper[4699]: I1122 04:21:33.454445 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7" event={"ID":"3975d03a-cd82-4ae3-89cb-fcad5f75330c","Type":"ContainerStarted","Data":"718492d0ba45544e254f36ed8ac8061739f102744c6941810530ac794d9b8afc"}
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.432992 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.441967 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/07114818-b4f9-465d-9745-c8a05af60e5a-memberlist\") pod \"speaker-psxdv\" (UID: \"07114818-b4f9-465d-9745-c8a05af60e5a\") " pod="metallb-system/speaker-psxdv"
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.450896 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-psxdv"
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.461937 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-g47xp" event={"ID":"dae7dee7-2390-47bc-83c9-488f48a4cc90","Type":"ContainerStarted","Data":"beb9bb6be63b8a2815a1754307b4699d59c001b981392dfca91eafa12b1cb474"}
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.461985 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-g47xp" event={"ID":"dae7dee7-2390-47bc-83c9-488f48a4cc90","Type":"ContainerStarted","Data":"ec63d228db8350c6ae5240a9cac4f6027bf70f1caf476c738d5b878c605e3740"}
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.463020 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-g47xp"
Nov 22 04:21:34 crc kubenswrapper[4699]: I1122 04:21:34.484427 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-g47xp" podStartSLOduration=2.48440808 podStartE2EDuration="2.48440808s" podCreationTimestamp="2025-11-22 04:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:21:34.483196811 +0000 UTC m=+845.825818018" watchObservedRunningTime="2025-11-22 04:21:34.48440808 +0000 UTC m=+845.827029267"
Nov 22 04:21:35 crc kubenswrapper[4699]: I1122 04:21:35.470138 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-psxdv" event={"ID":"07114818-b4f9-465d-9745-c8a05af60e5a","Type":"ContainerStarted","Data":"149968d82487a41ae9f00996af37e4f5ff00035d22051bbdb3e1504fec67d9ac"}
Nov 22 04:21:35 crc kubenswrapper[4699]: I1122 04:21:35.470515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-psxdv" event={"ID":"07114818-b4f9-465d-9745-c8a05af60e5a","Type":"ContainerStarted","Data":"583e9ff7cabf177692d667bb9646d18ad5aa8f7b5f385d24d737217bcc82a880"}
Nov 22 04:21:35 crc kubenswrapper[4699]: I1122 04:21:35.470532 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-psxdv" event={"ID":"07114818-b4f9-465d-9745-c8a05af60e5a","Type":"ContainerStarted","Data":"fa2ab5a632f19d70ef5f2f22408f237d716b5c87428dd4ca3512a62183849378"}
Nov 22 04:21:35 crc kubenswrapper[4699]: I1122 04:21:35.470686 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-psxdv"
Nov 22 04:21:35 crc kubenswrapper[4699]: I1122 04:21:35.486421 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-psxdv" podStartSLOduration=3.4864084 podStartE2EDuration="3.4864084s" podCreationTimestamp="2025-11-22 04:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:21:35.484008861 +0000 UTC m=+846.826630068" watchObservedRunningTime="2025-11-22 04:21:35.4864084 +0000 UTC m=+846.829029587"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.092492 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnw5h"]
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.093938 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnw5h"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.103178 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnw5h"]
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.262245 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.262552 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5q62\" (UniqueName: \"kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.262687 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.366528 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5q62\" (UniqueName: \"kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h"
Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.366612 4699
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.366642 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.367174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.367287 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.396746 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5q62\" (UniqueName: \"kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62\") pod \"community-operators-nnw5h\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.412620 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:36 crc kubenswrapper[4699]: I1122 04:21:36.813869 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnw5h"] Nov 22 04:21:37 crc kubenswrapper[4699]: I1122 04:21:37.488844 4699 generic.go:334] "Generic (PLEG): container finished" podID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerID="7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf" exitCode=0 Nov 22 04:21:37 crc kubenswrapper[4699]: I1122 04:21:37.489036 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerDied","Data":"7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf"} Nov 22 04:21:37 crc kubenswrapper[4699]: I1122 04:21:37.489207 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerStarted","Data":"222c3105075db03864606cd74893bb4e8b98bdf3932a2fa2af9cb9174ae1a993"} Nov 22 04:21:38 crc kubenswrapper[4699]: I1122 04:21:38.728037 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:21:38 crc kubenswrapper[4699]: I1122 04:21:38.728116 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:21:38 crc kubenswrapper[4699]: I1122 04:21:38.728174 4699 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:21:38 crc kubenswrapper[4699]: I1122 04:21:38.728951 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:21:38 crc kubenswrapper[4699]: I1122 04:21:38.729018 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283" gracePeriod=600 Nov 22 04:21:39 crc kubenswrapper[4699]: I1122 04:21:39.511077 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283" exitCode=0 Nov 22 04:21:39 crc kubenswrapper[4699]: I1122 04:21:39.511137 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283"} Nov 22 04:21:39 crc kubenswrapper[4699]: I1122 04:21:39.511174 4699 scope.go:117] "RemoveContainer" containerID="11c39a836dc5f876dbc167373de2ea59c5626be5c7a823f2dfb4fec915dd5ecc" Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.524381 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" 
event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c"} Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.525727 4699 generic.go:334] "Generic (PLEG): container finished" podID="2840ab61-4c34-4132-970e-c6d8c615c2bd" containerID="e85f92a01f47c5ba0b12017123536acd1fcd275ee0a0390a4f4a5e38ebca7bca" exitCode=0 Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.525805 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerDied","Data":"e85f92a01f47c5ba0b12017123536acd1fcd275ee0a0390a4f4a5e38ebca7bca"} Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.528722 4699 generic.go:334] "Generic (PLEG): container finished" podID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerID="176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc" exitCode=0 Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.528814 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerDied","Data":"176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc"} Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.530321 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7" event={"ID":"3975d03a-cd82-4ae3-89cb-fcad5f75330c","Type":"ContainerStarted","Data":"9a5181867f4be21b412a8de9be969d3bf1647d6aaf00a557682dafdbe32e4a56"} Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.530486 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7" Nov 22 04:21:41 crc kubenswrapper[4699]: I1122 04:21:41.576649 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7" podStartSLOduration=1.8056984950000001 podStartE2EDuration="9.576621926s" podCreationTimestamp="2025-11-22 04:21:32 +0000 UTC" firstStartedPulling="2025-11-22 04:21:33.129760597 +0000 UTC m=+844.472381784" lastFinishedPulling="2025-11-22 04:21:40.900684018 +0000 UTC m=+852.243305215" observedRunningTime="2025-11-22 04:21:41.55191407 +0000 UTC m=+852.894535257" watchObservedRunningTime="2025-11-22 04:21:41.576621926 +0000 UTC m=+852.919243123" Nov 22 04:21:42 crc kubenswrapper[4699]: I1122 04:21:42.536920 4699 generic.go:334] "Generic (PLEG): container finished" podID="2840ab61-4c34-4132-970e-c6d8c615c2bd" containerID="aae8e07f2f65215a8d0054dd773a86fee049096e7816c3261b6ceddc20b3d3e5" exitCode=0 Nov 22 04:21:42 crc kubenswrapper[4699]: I1122 04:21:42.536978 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerDied","Data":"aae8e07f2f65215a8d0054dd773a86fee049096e7816c3261b6ceddc20b3d3e5"} Nov 22 04:21:42 crc kubenswrapper[4699]: I1122 04:21:42.539162 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerStarted","Data":"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9"} Nov 22 04:21:43 crc kubenswrapper[4699]: I1122 04:21:43.547177 4699 generic.go:334] "Generic (PLEG): container finished" podID="2840ab61-4c34-4132-970e-c6d8c615c2bd" containerID="85b05c6fa2aa174d63771d186da6ed32d5b245a9f7a7cbe1f75cb75189b93937" exitCode=0 Nov 22 04:21:43 crc kubenswrapper[4699]: I1122 04:21:43.547219 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerDied","Data":"85b05c6fa2aa174d63771d186da6ed32d5b245a9f7a7cbe1f75cb75189b93937"} Nov 22 04:21:43 crc kubenswrapper[4699]: I1122 
04:21:43.571261 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnw5h" podStartSLOduration=2.8718060579999998 podStartE2EDuration="7.571240555s" podCreationTimestamp="2025-11-22 04:21:36 +0000 UTC" firstStartedPulling="2025-11-22 04:21:37.492267365 +0000 UTC m=+848.834888552" lastFinishedPulling="2025-11-22 04:21:42.191701862 +0000 UTC m=+853.534323049" observedRunningTime="2025-11-22 04:21:42.585855683 +0000 UTC m=+853.928476890" watchObservedRunningTime="2025-11-22 04:21:43.571240555 +0000 UTC m=+854.913861752" Nov 22 04:21:44 crc kubenswrapper[4699]: I1122 04:21:44.458165 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-psxdv" Nov 22 04:21:44 crc kubenswrapper[4699]: I1122 04:21:44.563968 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"6df0b6dfcad38058e258448a9f55ba00d6216c4b540c47a515afb466494b13ad"} Nov 22 04:21:44 crc kubenswrapper[4699]: I1122 04:21:44.564017 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"285e19da8fb20aee156085921fdafbb82833e0073f9b863f14bffe6ef8f94889"} Nov 22 04:21:44 crc kubenswrapper[4699]: I1122 04:21:44.564028 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"e2b72fc9d4afabebecc5ee479664fdc10934db0cbfc124917ffacf0503683f9c"} Nov 22 04:21:44 crc kubenswrapper[4699]: I1122 04:21:44.564040 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"3c9e17def9fbab093949e079da603f7ab6ad462ef5ab66cfb0cdd5d8e6f1b410"} Nov 22 04:21:45 
crc kubenswrapper[4699]: I1122 04:21:45.573465 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"ee185434d85767188cc7ef33c0aa72fbff031ceacf156e381390e849c8664678"} Nov 22 04:21:45 crc kubenswrapper[4699]: I1122 04:21:45.575324 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-fjgk6" event={"ID":"2840ab61-4c34-4132-970e-c6d8c615c2bd","Type":"ContainerStarted","Data":"1a971da345c89abbdd00ee4cf92873b30525171fbd329100db467dd20c5dbd71"} Nov 22 04:21:45 crc kubenswrapper[4699]: I1122 04:21:45.575463 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-fjgk6" Nov 22 04:21:45 crc kubenswrapper[4699]: I1122 04:21:45.602594 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-fjgk6" podStartSLOduration=5.755566039 podStartE2EDuration="13.602572085s" podCreationTimestamp="2025-11-22 04:21:32 +0000 UTC" firstStartedPulling="2025-11-22 04:21:33.104112128 +0000 UTC m=+844.446733315" lastFinishedPulling="2025-11-22 04:21:40.951118184 +0000 UTC m=+852.293739361" observedRunningTime="2025-11-22 04:21:45.597744367 +0000 UTC m=+856.940365574" watchObservedRunningTime="2025-11-22 04:21:45.602572085 +0000 UTC m=+856.945193282" Nov 22 04:21:46 crc kubenswrapper[4699]: I1122 04:21:46.413041 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:46 crc kubenswrapper[4699]: I1122 04:21:46.413105 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:46 crc kubenswrapper[4699]: I1122 04:21:46.459863 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:47 crc kubenswrapper[4699]: I1122 
04:21:47.881495 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-fjgk6" Nov 22 04:21:47 crc kubenswrapper[4699]: I1122 04:21:47.920905 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-fjgk6" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.486054 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wnnv6"] Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.487121 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.489145 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.489368 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.489597 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-rpf5p" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.494284 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wnnv6"] Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.573142 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nht5k\" (UniqueName: \"kubernetes.io/projected/0bb24428-cae6-49f4-b4d7-5a33488d5e2e-kube-api-access-nht5k\") pod \"openstack-operator-index-wnnv6\" (UID: \"0bb24428-cae6-49f4-b4d7-5a33488d5e2e\") " pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.674685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nht5k\" (UniqueName: \"kubernetes.io/projected/0bb24428-cae6-49f4-b4d7-5a33488d5e2e-kube-api-access-nht5k\") pod \"openstack-operator-index-wnnv6\" (UID: \"0bb24428-cae6-49f4-b4d7-5a33488d5e2e\") " pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.699231 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nht5k\" (UniqueName: \"kubernetes.io/projected/0bb24428-cae6-49f4-b4d7-5a33488d5e2e-kube-api-access-nht5k\") pod \"openstack-operator-index-wnnv6\" (UID: \"0bb24428-cae6-49f4-b4d7-5a33488d5e2e\") " pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:21:51 crc kubenswrapper[4699]: I1122 04:21:51.804647 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:21:52 crc kubenswrapper[4699]: I1122 04:21:52.245798 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wnnv6"] Nov 22 04:21:52 crc kubenswrapper[4699]: I1122 04:21:52.614071 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wnnv6" event={"ID":"0bb24428-cae6-49f4-b4d7-5a33488d5e2e","Type":"ContainerStarted","Data":"038367e5dde8bba3b8e47ed404302230ab697372d84f0902ae9dd3c7033ad6bc"} Nov 22 04:21:52 crc kubenswrapper[4699]: I1122 04:21:52.878671 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5tpp7" Nov 22 04:21:52 crc kubenswrapper[4699]: I1122 04:21:52.974219 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-g47xp" Nov 22 04:21:56 crc kubenswrapper[4699]: I1122 04:21:56.454576 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:21:59 crc kubenswrapper[4699]: 
I1122 04:21:59.654591 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wnnv6" event={"ID":"0bb24428-cae6-49f4-b4d7-5a33488d5e2e","Type":"ContainerStarted","Data":"9cc5a11c14d335f8b74476cabcaa4cf7e1187b98c5896349b12255de95d5400f"} Nov 22 04:21:59 crc kubenswrapper[4699]: I1122 04:21:59.667897 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wnnv6" podStartSLOduration=1.738368376 podStartE2EDuration="8.667878195s" podCreationTimestamp="2025-11-22 04:21:51 +0000 UTC" firstStartedPulling="2025-11-22 04:21:52.257256674 +0000 UTC m=+863.599877861" lastFinishedPulling="2025-11-22 04:21:59.186766493 +0000 UTC m=+870.529387680" observedRunningTime="2025-11-22 04:21:59.666316807 +0000 UTC m=+871.008938014" watchObservedRunningTime="2025-11-22 04:21:59.667878195 +0000 UTC m=+871.010499382" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.079335 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnw5h"] Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.079940 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnw5h" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="registry-server" containerID="cri-o://3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9" gracePeriod=2 Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.547372 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.619748 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5q62\" (UniqueName: \"kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62\") pod \"6341646f-dd9d-4645-bf5f-8ef749ba4082\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.619834 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities\") pod \"6341646f-dd9d-4645-bf5f-8ef749ba4082\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.619904 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content\") pod \"6341646f-dd9d-4645-bf5f-8ef749ba4082\" (UID: \"6341646f-dd9d-4645-bf5f-8ef749ba4082\") " Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.621625 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities" (OuterVolumeSpecName: "utilities") pod "6341646f-dd9d-4645-bf5f-8ef749ba4082" (UID: "6341646f-dd9d-4645-bf5f-8ef749ba4082"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.631382 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62" (OuterVolumeSpecName: "kube-api-access-d5q62") pod "6341646f-dd9d-4645-bf5f-8ef749ba4082" (UID: "6341646f-dd9d-4645-bf5f-8ef749ba4082"). InnerVolumeSpecName "kube-api-access-d5q62". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.668392 4699 generic.go:334] "Generic (PLEG): container finished" podID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerID="3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9" exitCode=0 Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.668448 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerDied","Data":"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9"} Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.668475 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnw5h" event={"ID":"6341646f-dd9d-4645-bf5f-8ef749ba4082","Type":"ContainerDied","Data":"222c3105075db03864606cd74893bb4e8b98bdf3932a2fa2af9cb9174ae1a993"} Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.668501 4699 scope.go:117] "RemoveContainer" containerID="3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.668451 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnw5h" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.670806 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6341646f-dd9d-4645-bf5f-8ef749ba4082" (UID: "6341646f-dd9d-4645-bf5f-8ef749ba4082"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.683546 4699 scope.go:117] "RemoveContainer" containerID="176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.700710 4699 scope.go:117] "RemoveContainer" containerID="7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.717411 4699 scope.go:117] "RemoveContainer" containerID="3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9" Nov 22 04:22:01 crc kubenswrapper[4699]: E1122 04:22:01.717849 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9\": container with ID starting with 3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9 not found: ID does not exist" containerID="3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.717896 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9"} err="failed to get container status \"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9\": rpc error: code = NotFound desc = could not find container \"3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9\": container with ID starting with 3f44502400193f23cf50b7a0aac7a619d4e376f6d86fcdf8303adcc778f547d9 not found: ID does not exist" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.717928 4699 scope.go:117] "RemoveContainer" containerID="176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc" Nov 22 04:22:01 crc kubenswrapper[4699]: E1122 04:22:01.718301 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc\": container with ID starting with 176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc not found: ID does not exist" containerID="176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.718331 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc"} err="failed to get container status \"176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc\": rpc error: code = NotFound desc = could not find container \"176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc\": container with ID starting with 176b029e25b2ec39e2a19e4baf9d6d41b547e9aa31e19fbdb69f1e936c0ff3cc not found: ID does not exist" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.718353 4699 scope.go:117] "RemoveContainer" containerID="7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf" Nov 22 04:22:01 crc kubenswrapper[4699]: E1122 04:22:01.718608 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf\": container with ID starting with 7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf not found: ID does not exist" containerID="7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.718628 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf"} err="failed to get container status \"7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf\": rpc error: code = NotFound desc = could not find container \"7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf\": 
container with ID starting with 7b6259fc5ef417620830d1a326b26766cb5ec491f19bfbe7fcacf90131899caf not found: ID does not exist" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.721270 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.721293 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5q62\" (UniqueName: \"kubernetes.io/projected/6341646f-dd9d-4645-bf5f-8ef749ba4082-kube-api-access-d5q62\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.721306 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6341646f-dd9d-4645-bf5f-8ef749ba4082-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.805418 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.805502 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.828560 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.995371 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnw5h"] Nov 22 04:22:01 crc kubenswrapper[4699]: I1122 04:22:01.999828 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnw5h"] Nov 22 04:22:02 crc kubenswrapper[4699]: I1122 04:22:02.887321 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-fjgk6" Nov 22 04:22:03 crc kubenswrapper[4699]: I1122 04:22:03.453692 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" path="/var/lib/kubelet/pods/6341646f-dd9d-4645-bf5f-8ef749ba4082/volumes" Nov 22 04:22:11 crc kubenswrapper[4699]: I1122 04:22:11.839808 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wnnv6" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.514518 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w"] Nov 22 04:22:20 crc kubenswrapper[4699]: E1122 04:22:20.516275 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="extract-content" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.516366 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="extract-content" Nov 22 04:22:20 crc kubenswrapper[4699]: E1122 04:22:20.516469 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="extract-utilities" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.516560 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="extract-utilities" Nov 22 04:22:20 crc kubenswrapper[4699]: E1122 04:22:20.516636 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="registry-server" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.516695 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="registry-server" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.516883 4699 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6341646f-dd9d-4645-bf5f-8ef749ba4082" containerName="registry-server" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.517827 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.521178 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-76wj6" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.527569 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w"] Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.704230 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.704292 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzdtw\" (UniqueName: \"kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.704390 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle\") pod 
\"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.805883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.805960 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.806005 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzdtw\" (UniqueName: \"kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.806366 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " 
pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.806413 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.825737 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzdtw\" (UniqueName: \"kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw\") pod \"0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:20 crc kubenswrapper[4699]: I1122 04:22:20.963549 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:21 crc kubenswrapper[4699]: I1122 04:22:21.357207 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w"] Nov 22 04:22:21 crc kubenswrapper[4699]: I1122 04:22:21.780807 4699 generic.go:334] "Generic (PLEG): container finished" podID="d98e67ec-e732-4646-859f-5dcf61d03def" containerID="bd6373eb677f76427dc8103acdaf9c0930dc0bd8b28cc27afc4acd22806b346b" exitCode=0 Nov 22 04:22:21 crc kubenswrapper[4699]: I1122 04:22:21.780915 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" event={"ID":"d98e67ec-e732-4646-859f-5dcf61d03def","Type":"ContainerDied","Data":"bd6373eb677f76427dc8103acdaf9c0930dc0bd8b28cc27afc4acd22806b346b"} Nov 22 04:22:21 crc kubenswrapper[4699]: I1122 04:22:21.781185 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" event={"ID":"d98e67ec-e732-4646-859f-5dcf61d03def","Type":"ContainerStarted","Data":"d3dca1529e9614e2005ea4dcc597e0720de9824ec283a1ee2e49a47eb538a280"} Nov 22 04:22:22 crc kubenswrapper[4699]: I1122 04:22:22.791098 4699 generic.go:334] "Generic (PLEG): container finished" podID="d98e67ec-e732-4646-859f-5dcf61d03def" containerID="e50d0591047440db472c46bb30b979a975ba42688b84bf12ff1e20ef5fed6df7" exitCode=0 Nov 22 04:22:22 crc kubenswrapper[4699]: I1122 04:22:22.791144 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" event={"ID":"d98e67ec-e732-4646-859f-5dcf61d03def","Type":"ContainerDied","Data":"e50d0591047440db472c46bb30b979a975ba42688b84bf12ff1e20ef5fed6df7"} Nov 22 04:22:23 crc kubenswrapper[4699]: I1122 04:22:23.798814 4699 generic.go:334] 
"Generic (PLEG): container finished" podID="d98e67ec-e732-4646-859f-5dcf61d03def" containerID="f5e2adee0b42f4bb9337d04f2e74b21c07d8fcf7ab359aebc2ebdc23ce6cda05" exitCode=0 Nov 22 04:22:23 crc kubenswrapper[4699]: I1122 04:22:23.798901 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" event={"ID":"d98e67ec-e732-4646-859f-5dcf61d03def","Type":"ContainerDied","Data":"f5e2adee0b42f4bb9337d04f2e74b21c07d8fcf7ab359aebc2ebdc23ce6cda05"} Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.023053 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.161482 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util\") pod \"d98e67ec-e732-4646-859f-5dcf61d03def\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.161573 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle\") pod \"d98e67ec-e732-4646-859f-5dcf61d03def\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.161646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzdtw\" (UniqueName: \"kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw\") pod \"d98e67ec-e732-4646-859f-5dcf61d03def\" (UID: \"d98e67ec-e732-4646-859f-5dcf61d03def\") " Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.162321 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle" (OuterVolumeSpecName: "bundle") pod "d98e67ec-e732-4646-859f-5dcf61d03def" (UID: "d98e67ec-e732-4646-859f-5dcf61d03def"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.166642 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw" (OuterVolumeSpecName: "kube-api-access-xzdtw") pod "d98e67ec-e732-4646-859f-5dcf61d03def" (UID: "d98e67ec-e732-4646-859f-5dcf61d03def"). InnerVolumeSpecName "kube-api-access-xzdtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.175642 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util" (OuterVolumeSpecName: "util") pod "d98e67ec-e732-4646-859f-5dcf61d03def" (UID: "d98e67ec-e732-4646-859f-5dcf61d03def"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.263119 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzdtw\" (UniqueName: \"kubernetes.io/projected/d98e67ec-e732-4646-859f-5dcf61d03def-kube-api-access-xzdtw\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.263166 4699 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-util\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.263177 4699 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d98e67ec-e732-4646-859f-5dcf61d03def-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.815026 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" event={"ID":"d98e67ec-e732-4646-859f-5dcf61d03def","Type":"ContainerDied","Data":"d3dca1529e9614e2005ea4dcc597e0720de9824ec283a1ee2e49a47eb538a280"} Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.815089 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3dca1529e9614e2005ea4dcc597e0720de9824ec283a1ee2e49a47eb538a280" Nov 22 04:22:25 crc kubenswrapper[4699]: I1122 04:22:25.815119 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.424098 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l"] Nov 22 04:22:33 crc kubenswrapper[4699]: E1122 04:22:33.424838 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="extract" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.424851 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="extract" Nov 22 04:22:33 crc kubenswrapper[4699]: E1122 04:22:33.424862 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="pull" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.424870 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="pull" Nov 22 04:22:33 crc kubenswrapper[4699]: E1122 04:22:33.424968 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="util" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.424976 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="util" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.425082 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98e67ec-e732-4646-859f-5dcf61d03def" containerName="extract" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.425807 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.429169 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-p8sr7" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.454475 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l"] Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.570548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkst\" (UniqueName: \"kubernetes.io/projected/3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c-kube-api-access-pmkst\") pod \"openstack-operator-controller-operator-655bc68c75-ttb9l\" (UID: \"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c\") " pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.671412 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkst\" (UniqueName: \"kubernetes.io/projected/3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c-kube-api-access-pmkst\") pod \"openstack-operator-controller-operator-655bc68c75-ttb9l\" (UID: \"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c\") " pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.689554 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkst\" (UniqueName: \"kubernetes.io/projected/3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c-kube-api-access-pmkst\") pod \"openstack-operator-controller-operator-655bc68c75-ttb9l\" (UID: \"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c\") " pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.743229 4699 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:33 crc kubenswrapper[4699]: I1122 04:22:33.967942 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l"] Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.351009 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.353209 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.369286 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.390204 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfzh\" (UniqueName: \"kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.390254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.390281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content\") 
pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.491697 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfzh\" (UniqueName: \"kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.491755 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.491792 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.493340 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.493822 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities\") pod \"certified-operators-jzx5b\" (UID: 
\"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.516175 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfzh\" (UniqueName: \"kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh\") pod \"certified-operators-jzx5b\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.687504 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.893915 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" event={"ID":"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c","Type":"ContainerStarted","Data":"c7e8a6093f9d3b063afb8daba832bf9b2b88b7c88a0a23619da96f9ba485e145"} Nov 22 04:22:34 crc kubenswrapper[4699]: I1122 04:22:34.991610 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:35 crc kubenswrapper[4699]: I1122 04:22:35.908339 4699 generic.go:334] "Generic (PLEG): container finished" podID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerID="5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a" exitCode=0 Nov 22 04:22:35 crc kubenswrapper[4699]: I1122 04:22:35.908547 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerDied","Data":"5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a"} Nov 22 04:22:35 crc kubenswrapper[4699]: I1122 04:22:35.909641 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" 
event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerStarted","Data":"f6f3a45d2ed667c66499af01a5c7fd7b2892cf5ea429040a371c80380bad895a"} Nov 22 04:22:38 crc kubenswrapper[4699]: I1122 04:22:38.927416 4699 generic.go:334] "Generic (PLEG): container finished" podID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerID="97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1" exitCode=0 Nov 22 04:22:38 crc kubenswrapper[4699]: I1122 04:22:38.927598 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerDied","Data":"97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1"} Nov 22 04:22:38 crc kubenswrapper[4699]: I1122 04:22:38.930214 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" event={"ID":"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c","Type":"ContainerStarted","Data":"7f9c7864025ee82fd95ee7cf7ffc58a4a8da325af3632e337706611957abd819"} Nov 22 04:22:39 crc kubenswrapper[4699]: I1122 04:22:39.937222 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerStarted","Data":"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85"} Nov 22 04:22:39 crc kubenswrapper[4699]: I1122 04:22:39.970088 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jzx5b" podStartSLOduration=3.011954464 podStartE2EDuration="5.970074057s" podCreationTimestamp="2025-11-22 04:22:34 +0000 UTC" firstStartedPulling="2025-11-22 04:22:36.488463513 +0000 UTC m=+907.831084700" lastFinishedPulling="2025-11-22 04:22:39.446583096 +0000 UTC m=+910.789204293" observedRunningTime="2025-11-22 04:22:39.962625577 +0000 UTC m=+911.305246774" watchObservedRunningTime="2025-11-22 
04:22:39.970074057 +0000 UTC m=+911.312695244" Nov 22 04:22:41 crc kubenswrapper[4699]: I1122 04:22:41.950062 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" event={"ID":"3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c","Type":"ContainerStarted","Data":"936f68d9e8c90ba3dfc7e157de3c0b3707cefa3e15c907a0ff582b549596f62c"} Nov 22 04:22:41 crc kubenswrapper[4699]: I1122 04:22:41.950679 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:41 crc kubenswrapper[4699]: I1122 04:22:41.981245 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" podStartSLOduration=1.90629381 podStartE2EDuration="8.981227227s" podCreationTimestamp="2025-11-22 04:22:33 +0000 UTC" firstStartedPulling="2025-11-22 04:22:33.974673798 +0000 UTC m=+905.317294985" lastFinishedPulling="2025-11-22 04:22:41.049607215 +0000 UTC m=+912.392228402" observedRunningTime="2025-11-22 04:22:41.978672405 +0000 UTC m=+913.321293612" watchObservedRunningTime="2025-11-22 04:22:41.981227227 +0000 UTC m=+913.323848414" Nov 22 04:22:43 crc kubenswrapper[4699]: I1122 04:22:43.746322 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-655bc68c75-ttb9l" Nov 22 04:22:44 crc kubenswrapper[4699]: I1122 04:22:44.688450 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:44 crc kubenswrapper[4699]: I1122 04:22:44.688829 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:44 crc kubenswrapper[4699]: I1122 04:22:44.727217 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:45 crc kubenswrapper[4699]: I1122 04:22:45.001909 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:46 crc kubenswrapper[4699]: I1122 04:22:46.133013 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:46 crc kubenswrapper[4699]: I1122 04:22:46.977455 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jzx5b" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="registry-server" containerID="cri-o://3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85" gracePeriod=2 Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.405137 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.568233 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content\") pod \"4a3ccfea-186c-4ca3-835c-8109de4831c2\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.568311 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfzh\" (UniqueName: \"kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh\") pod \"4a3ccfea-186c-4ca3-835c-8109de4831c2\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.568370 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities\") pod 
\"4a3ccfea-186c-4ca3-835c-8109de4831c2\" (UID: \"4a3ccfea-186c-4ca3-835c-8109de4831c2\") " Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.570559 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities" (OuterVolumeSpecName: "utilities") pod "4a3ccfea-186c-4ca3-835c-8109de4831c2" (UID: "4a3ccfea-186c-4ca3-835c-8109de4831c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.574561 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh" (OuterVolumeSpecName: "kube-api-access-xwfzh") pod "4a3ccfea-186c-4ca3-835c-8109de4831c2" (UID: "4a3ccfea-186c-4ca3-835c-8109de4831c2"). InnerVolumeSpecName "kube-api-access-xwfzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.618410 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a3ccfea-186c-4ca3-835c-8109de4831c2" (UID: "4a3ccfea-186c-4ca3-835c-8109de4831c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.670671 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwfzh\" (UniqueName: \"kubernetes.io/projected/4a3ccfea-186c-4ca3-835c-8109de4831c2-kube-api-access-xwfzh\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.670986 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:47 crc kubenswrapper[4699]: I1122 04:22:47.671082 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a3ccfea-186c-4ca3-835c-8109de4831c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.003742 4699 generic.go:334] "Generic (PLEG): container finished" podID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerID="3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85" exitCode=0 Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.003799 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerDied","Data":"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85"} Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.003829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jzx5b" event={"ID":"4a3ccfea-186c-4ca3-835c-8109de4831c2","Type":"ContainerDied","Data":"f6f3a45d2ed667c66499af01a5c7fd7b2892cf5ea429040a371c80380bad895a"} Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.003859 4699 scope.go:117] "RemoveContainer" containerID="3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 
04:22:48.004052 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jzx5b" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.045636 4699 scope.go:117] "RemoveContainer" containerID="97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.065560 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.065614 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jzx5b"] Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.080895 4699 scope.go:117] "RemoveContainer" containerID="5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.105738 4699 scope.go:117] "RemoveContainer" containerID="3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85" Nov 22 04:22:48 crc kubenswrapper[4699]: E1122 04:22:48.106304 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85\": container with ID starting with 3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85 not found: ID does not exist" containerID="3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.106417 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85"} err="failed to get container status \"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85\": rpc error: code = NotFound desc = could not find container \"3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85\": container with ID starting with 
3e73cdf0ccdac7e507daa2869b1e4210e894a1b717f73c0f700f0ca7f38d5d85 not found: ID does not exist" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.106555 4699 scope.go:117] "RemoveContainer" containerID="97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1" Nov 22 04:22:48 crc kubenswrapper[4699]: E1122 04:22:48.106911 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1\": container with ID starting with 97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1 not found: ID does not exist" containerID="97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.106940 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1"} err="failed to get container status \"97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1\": rpc error: code = NotFound desc = could not find container \"97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1\": container with ID starting with 97483feeb678b7312eb5ac8f75081a8fe52605fec4c8a53d4efd8ba74eb4b4a1 not found: ID does not exist" Nov 22 04:22:48 crc kubenswrapper[4699]: I1122 04:22:48.106959 4699 scope.go:117] "RemoveContainer" containerID="5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a" Nov 22 04:22:48 crc kubenswrapper[4699]: E1122 04:22:48.107228 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a\": container with ID starting with 5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a not found: ID does not exist" containerID="5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a" Nov 22 04:22:48 crc 
kubenswrapper[4699]: I1122 04:22:48.107326 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a"} err="failed to get container status \"5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a\": rpc error: code = NotFound desc = could not find container \"5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a\": container with ID starting with 5f527d697fb4f56e2f924f113826c40eb75c58939c529683307b09a2876eaf3a not found: ID does not exist" Nov 22 04:22:49 crc kubenswrapper[4699]: I1122 04:22:49.455092 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" path="/var/lib/kubelet/pods/4a3ccfea-186c-4ca3-835c-8109de4831c2/volumes" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.553905 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls"] Nov 22 04:23:06 crc kubenswrapper[4699]: E1122 04:23:06.554963 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="extract-utilities" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.554979 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="extract-utilities" Nov 22 04:23:06 crc kubenswrapper[4699]: E1122 04:23:06.554991 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="extract-content" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.555000 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="extract-content" Nov 22 04:23:06 crc kubenswrapper[4699]: E1122 04:23:06.555015 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" 
containerName="registry-server" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.555025 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="registry-server" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.555180 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a3ccfea-186c-4ca3-835c-8109de4831c2" containerName="registry-server" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.556037 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.558311 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-cc6gw" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.570697 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.574233 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.575159 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.581685 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7zrpb" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.595399 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.596457 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.599865 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gpgsf" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.601591 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.606533 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-thd4s"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.607393 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.609566 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6m5k4" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.622764 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.622925 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trq6n\" (UniqueName: \"kubernetes.io/projected/b4a26451-a994-4295-b354-46babc06a258-kube-api-access-trq6n\") pod \"designate-operator-controller-manager-767ccfd65f-6fd98\" (UID: \"b4a26451-a994-4295-b354-46babc06a258\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.622984 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r58q\" 
(UniqueName: \"kubernetes.io/projected/18c1e29a-63b8-4973-92a9-87c5b0301565-kube-api-access-2r58q\") pod \"glance-operator-controller-manager-7969689c84-thd4s\" (UID: \"18c1e29a-63b8-4973-92a9-87c5b0301565\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.623117 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jpw\" (UniqueName: \"kubernetes.io/projected/455f990d-3a21-4c84-8a9d-e4a4af10c47f-kube-api-access-z2jpw\") pod \"cinder-operator-controller-manager-6498cbf48f-pq7wj\" (UID: \"455f990d-3a21-4c84-8a9d-e4a4af10c47f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.623151 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nthhg\" (UniqueName: \"kubernetes.io/projected/5cda144e-7465-4060-945a-89e3d288c551-kube-api-access-nthhg\") pod \"barbican-operator-controller-manager-75fb479bcc-qcvls\" (UID: \"5cda144e-7465-4060-945a-89e3d288c551\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.633334 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.634224 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.639792 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cbhqz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.673023 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-thd4s"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.683117 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.713544 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.714749 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.718762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rmxgj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724075 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jpw\" (UniqueName: \"kubernetes.io/projected/455f990d-3a21-4c84-8a9d-e4a4af10c47f-kube-api-access-z2jpw\") pod \"cinder-operator-controller-manager-6498cbf48f-pq7wj\" (UID: \"455f990d-3a21-4c84-8a9d-e4a4af10c47f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724128 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8qh\" (UniqueName: \"kubernetes.io/projected/d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3-kube-api-access-mp8qh\") pod \"heat-operator-controller-manager-56f54d6746-k9wzx\" (UID: \"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724150 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nthhg\" (UniqueName: \"kubernetes.io/projected/5cda144e-7465-4060-945a-89e3d288c551-kube-api-access-nthhg\") pod \"barbican-operator-controller-manager-75fb479bcc-qcvls\" (UID: \"5cda144e-7465-4060-945a-89e3d288c551\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724184 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trq6n\" (UniqueName: \"kubernetes.io/projected/b4a26451-a994-4295-b354-46babc06a258-kube-api-access-trq6n\") pod 
\"designate-operator-controller-manager-767ccfd65f-6fd98\" (UID: \"b4a26451-a994-4295-b354-46babc06a258\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724201 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r58q\" (UniqueName: \"kubernetes.io/projected/18c1e29a-63b8-4973-92a9-87c5b0301565-kube-api-access-2r58q\") pod \"glance-operator-controller-manager-7969689c84-thd4s\" (UID: \"18c1e29a-63b8-4973-92a9-87c5b0301565\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.724258 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx85h\" (UniqueName: \"kubernetes.io/projected/34a9f105-8024-4cc0-9ad2-14029731110d-kube-api-access-zx85h\") pod \"horizon-operator-controller-manager-598f69df5d-kd4w8\" (UID: \"34a9f105-8024-4cc0-9ad2-14029731110d\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.741403 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.742780 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.744980 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.745163 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5z7dm" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.757115 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.758359 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.768856 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xgv5p" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.769922 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nthhg\" (UniqueName: \"kubernetes.io/projected/5cda144e-7465-4060-945a-89e3d288c551-kube-api-access-nthhg\") pod \"barbican-operator-controller-manager-75fb479bcc-qcvls\" (UID: \"5cda144e-7465-4060-945a-89e3d288c551\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.780884 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trq6n\" (UniqueName: \"kubernetes.io/projected/b4a26451-a994-4295-b354-46babc06a258-kube-api-access-trq6n\") pod \"designate-operator-controller-manager-767ccfd65f-6fd98\" (UID: \"b4a26451-a994-4295-b354-46babc06a258\") " 
pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.786788 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r58q\" (UniqueName: \"kubernetes.io/projected/18c1e29a-63b8-4973-92a9-87c5b0301565-kube-api-access-2r58q\") pod \"glance-operator-controller-manager-7969689c84-thd4s\" (UID: \"18c1e29a-63b8-4973-92a9-87c5b0301565\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.789654 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jpw\" (UniqueName: \"kubernetes.io/projected/455f990d-3a21-4c84-8a9d-e4a4af10c47f-kube-api-access-z2jpw\") pod \"cinder-operator-controller-manager-6498cbf48f-pq7wj\" (UID: \"455f990d-3a21-4c84-8a9d-e4a4af10c47f\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.801357 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.812166 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.825492 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.826507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7jf\" (UniqueName: \"kubernetes.io/projected/aaa73391-c097-4428-a43d-a5a4c1469419-kube-api-access-5g7jf\") pod \"ironic-operator-controller-manager-5d95d484b9-g8rz2\" (UID: \"aaa73391-c097-4428-a43d-a5a4c1469419\") " 
pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.826548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-cert\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.826573 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx85h\" (UniqueName: \"kubernetes.io/projected/34a9f105-8024-4cc0-9ad2-14029731110d-kube-api-access-zx85h\") pod \"horizon-operator-controller-manager-598f69df5d-kd4w8\" (UID: \"34a9f105-8024-4cc0-9ad2-14029731110d\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.826604 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8qh\" (UniqueName: \"kubernetes.io/projected/d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3-kube-api-access-mp8qh\") pod \"heat-operator-controller-manager-56f54d6746-k9wzx\" (UID: \"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.826653 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhnfc\" (UniqueName: \"kubernetes.io/projected/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-kube-api-access-bhnfc\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.831081 4699 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.832211 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.837116 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-twc9q" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.846111 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-qsszz"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.846670 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8qh\" (UniqueName: \"kubernetes.io/projected/d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3-kube-api-access-mp8qh\") pod \"heat-operator-controller-manager-56f54d6746-k9wzx\" (UID: \"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.847582 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.851134 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vnzdr" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.854144 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.862138 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx85h\" (UniqueName: \"kubernetes.io/projected/34a9f105-8024-4cc0-9ad2-14029731110d-kube-api-access-zx85h\") pod \"horizon-operator-controller-manager-598f69df5d-kd4w8\" (UID: \"34a9f105-8024-4cc0-9ad2-14029731110d\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.873946 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.874996 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.877651 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-qsszz"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.880359 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.890440 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.892149 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wt2wl" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.896234 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.897171 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.899230 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-cqgqv" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.901781 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.929083 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7jf\" (UniqueName: \"kubernetes.io/projected/aaa73391-c097-4428-a43d-a5a4c1469419-kube-api-access-5g7jf\") pod \"ironic-operator-controller-manager-5d95d484b9-g8rz2\" (UID: \"aaa73391-c097-4428-a43d-a5a4c1469419\") " pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.929141 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-cert\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.929187 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhnfc\" (UniqueName: \"kubernetes.io/projected/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-kube-api-access-bhnfc\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.932015 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.942212 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.943016 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-cert\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.946839 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.956218 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr"] Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.957345 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.966151 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhnfc\" (UniqueName: \"kubernetes.io/projected/c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477-kube-api-access-bhnfc\") pod \"infra-operator-controller-manager-7875d8bb94-q9xzz\" (UID: \"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477\") " pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.969876 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-vgflm" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.971307 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.976926 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g7jf\" (UniqueName: \"kubernetes.io/projected/aaa73391-c097-4428-a43d-a5a4c1469419-kube-api-access-5g7jf\") pod \"ironic-operator-controller-manager-5d95d484b9-g8rz2\" (UID: \"aaa73391-c097-4428-a43d-a5a4c1469419\") " pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:06 crc kubenswrapper[4699]: I1122 04:23:06.981745 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.001923 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.003019 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.004988 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kj7hc" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.013404 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.034103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7flch\" (UniqueName: \"kubernetes.io/projected/aca6ad44-aa04-4178-ab59-bfdec68e49e7-kube-api-access-7flch\") pod \"mariadb-operator-controller-manager-54b5986bb8-zd2r8\" (UID: \"aca6ad44-aa04-4178-ab59-bfdec68e49e7\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.034145 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjwb\" (UniqueName: \"kubernetes.io/projected/91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930-kube-api-access-bgjwb\") pod \"neutron-operator-controller-manager-78bd47f458-v289m\" (UID: \"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.034556 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.037093 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7vxq\" (UniqueName: \"kubernetes.io/projected/8ea3fa32-8451-4f8a-b395-98ce1382e116-kube-api-access-m7vxq\") pod \"keystone-operator-controller-manager-7454b96578-2dh9g\" (UID: \"8ea3fa32-8451-4f8a-b395-98ce1382e116\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.037168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5596m\" (UniqueName: \"kubernetes.io/projected/072faef9-c4a0-4bf9-84a8-fadca8945449-kube-api-access-5596m\") pod \"manila-operator-controller-manager-58f887965d-qsszz\" (UID: \"072faef9-c4a0-4bf9-84a8-fadca8945449\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.053970 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.055585 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.057571 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.057706 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-q2f62" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.062979 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.063964 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.067843 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f4mcw" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.068565 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.082574 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.084002 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.086414 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5dmrs" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.111763 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.120066 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.121113 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.122544 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-h4n92" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.126765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.145636 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.147061 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjwb\" (UniqueName: \"kubernetes.io/projected/91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930-kube-api-access-bgjwb\") pod \"neutron-operator-controller-manager-78bd47f458-v289m\" (UID: \"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.147127 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7vxq\" (UniqueName: \"kubernetes.io/projected/8ea3fa32-8451-4f8a-b395-98ce1382e116-kube-api-access-m7vxq\") pod \"keystone-operator-controller-manager-7454b96578-2dh9g\" (UID: \"8ea3fa32-8451-4f8a-b395-98ce1382e116\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.147168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdvs\" (UniqueName: \"kubernetes.io/projected/36905f20-0246-46f5-921a-2d18b2db8bdd-kube-api-access-5xdvs\") pod \"octavia-operator-controller-manager-54cfbf4c7d-7rck7\" (UID: \"36905f20-0246-46f5-921a-2d18b2db8bdd\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.147201 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5596m\" (UniqueName: \"kubernetes.io/projected/072faef9-c4a0-4bf9-84a8-fadca8945449-kube-api-access-5596m\") pod \"manila-operator-controller-manager-58f887965d-qsszz\" (UID: \"072faef9-c4a0-4bf9-84a8-fadca8945449\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:07 crc 
kubenswrapper[4699]: I1122 04:23:07.147238 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4kg\" (UniqueName: \"kubernetes.io/projected/e8f34ea0-681d-4a19-b9c9-0c230a7261e3-kube-api-access-7r4kg\") pod \"nova-operator-controller-manager-cfbb9c588-qlcpr\" (UID: \"e8f34ea0-681d-4a19-b9c9-0c230a7261e3\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.147256 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flch\" (UniqueName: \"kubernetes.io/projected/aca6ad44-aa04-4178-ab59-bfdec68e49e7-kube-api-access-7flch\") pod \"mariadb-operator-controller-manager-54b5986bb8-zd2r8\" (UID: \"aca6ad44-aa04-4178-ab59-bfdec68e49e7\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.152750 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.171961 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.172221 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjwb\" (UniqueName: \"kubernetes.io/projected/91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930-kube-api-access-bgjwb\") pod \"neutron-operator-controller-manager-78bd47f458-v289m\" (UID: \"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.188348 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flch\" (UniqueName: 
\"kubernetes.io/projected/aca6ad44-aa04-4178-ab59-bfdec68e49e7-kube-api-access-7flch\") pod \"mariadb-operator-controller-manager-54b5986bb8-zd2r8\" (UID: \"aca6ad44-aa04-4178-ab59-bfdec68e49e7\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.193406 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.196165 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.197112 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7vxq\" (UniqueName: \"kubernetes.io/projected/8ea3fa32-8451-4f8a-b395-98ce1382e116-kube-api-access-m7vxq\") pod \"keystone-operator-controller-manager-7454b96578-2dh9g\" (UID: \"8ea3fa32-8451-4f8a-b395-98ce1382e116\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.201726 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-c7xrq" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.204898 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.205910 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5596m\" (UniqueName: \"kubernetes.io/projected/072faef9-c4a0-4bf9-84a8-fadca8945449-kube-api-access-5596m\") pod \"manila-operator-controller-manager-58f887965d-qsszz\" (UID: \"072faef9-c4a0-4bf9-84a8-fadca8945449\") " 
pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.217846 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.259033 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263757 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfr74\" (UniqueName: \"kubernetes.io/projected/a67d0761-3d62-4e25-80bc-cf6fac86cf0b-kube-api-access-sfr74\") pod \"swift-operator-controller-manager-d656998f4-fcjt6\" (UID: \"a67d0761-3d62-4e25-80bc-cf6fac86cf0b\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263813 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9hm\" (UniqueName: \"kubernetes.io/projected/913c840a-25a6-46f9-bd06-e379438a5292-kube-api-access-rt9hm\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263875 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4kg\" (UniqueName: \"kubernetes.io/projected/e8f34ea0-681d-4a19-b9c9-0c230a7261e3-kube-api-access-7r4kg\") pod \"nova-operator-controller-manager-cfbb9c588-qlcpr\" (UID: \"e8f34ea0-681d-4a19-b9c9-0c230a7261e3\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263933 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6lg\" (UniqueName: \"kubernetes.io/projected/7199810d-9e13-4ed5-a4bc-46c874551678-kube-api-access-2f6lg\") pod \"ovn-operator-controller-manager-54fc5f65b7-j5b4z\" (UID: \"7199810d-9e13-4ed5-a4bc-46c874551678\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263964 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.263985 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xdvs\" (UniqueName: \"kubernetes.io/projected/36905f20-0246-46f5-921a-2d18b2db8bdd-kube-api-access-5xdvs\") pod \"octavia-operator-controller-manager-54cfbf4c7d-7rck7\" (UID: \"36905f20-0246-46f5-921a-2d18b2db8bdd\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.264004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwbq\" (UniqueName: \"kubernetes.io/projected/0669449f-4e6b-4ab7-90ee-f8d93286db7a-kube-api-access-ltwbq\") pod \"placement-operator-controller-manager-5b797b8dff-fpcpg\" (UID: \"0669449f-4e6b-4ab7-90ee-f8d93286db7a\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.264340 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.270405 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.270828 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.279903 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-m6sfz" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.290851 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.300612 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4kg\" (UniqueName: \"kubernetes.io/projected/e8f34ea0-681d-4a19-b9c9-0c230a7261e3-kube-api-access-7r4kg\") pod \"nova-operator-controller-manager-cfbb9c588-qlcpr\" (UID: \"e8f34ea0-681d-4a19-b9c9-0c230a7261e3\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.309139 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.315711 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.346011 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.348078 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.350126 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-95p48" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.355726 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.356649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdvs\" (UniqueName: \"kubernetes.io/projected/36905f20-0246-46f5-921a-2d18b2db8bdd-kube-api-access-5xdvs\") pod \"octavia-operator-controller-manager-54cfbf4c7d-7rck7\" (UID: \"36905f20-0246-46f5-921a-2d18b2db8bdd\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366117 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6lg\" (UniqueName: \"kubernetes.io/projected/7199810d-9e13-4ed5-a4bc-46c874551678-kube-api-access-2f6lg\") pod \"ovn-operator-controller-manager-54fc5f65b7-j5b4z\" (UID: \"7199810d-9e13-4ed5-a4bc-46c874551678\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366183 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366226 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwbq\" (UniqueName: \"kubernetes.io/projected/0669449f-4e6b-4ab7-90ee-f8d93286db7a-kube-api-access-ltwbq\") pod \"placement-operator-controller-manager-5b797b8dff-fpcpg\" (UID: \"0669449f-4e6b-4ab7-90ee-f8d93286db7a\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366293 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpxz\" (UniqueName: \"kubernetes.io/projected/1cf7d81b-c0df-48d7-9b01-b7185a803ac6-kube-api-access-fdpxz\") pod \"test-operator-controller-manager-b4c496f69-j5w8t\" (UID: \"1cf7d81b-c0df-48d7-9b01-b7185a803ac6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366335 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfr74\" (UniqueName: \"kubernetes.io/projected/a67d0761-3d62-4e25-80bc-cf6fac86cf0b-kube-api-access-sfr74\") pod \"swift-operator-controller-manager-d656998f4-fcjt6\" (UID: \"a67d0761-3d62-4e25-80bc-cf6fac86cf0b\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtnbs\" (UniqueName: \"kubernetes.io/projected/0dc61afc-07cb-46af-afb8-4c0bf3bc84f0-kube-api-access-mtnbs\") pod 
\"telemetry-operator-controller-manager-6d4bf84b58-lxncl\" (UID: \"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366399 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9hm\" (UniqueName: \"kubernetes.io/projected/913c840a-25a6-46f9-bd06-e379438a5292-kube-api-access-rt9hm\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.366478 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh4pf\" (UniqueName: \"kubernetes.io/projected/482b57cb-741a-4062-9479-2a41febc67af-kube-api-access-bh4pf\") pod \"watcher-operator-controller-manager-8c6448b9f-t4rhb\" (UID: \"482b57cb-741a-4062-9479-2a41febc67af\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:07 crc kubenswrapper[4699]: E1122 04:23:07.366899 4699 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 04:23:07 crc kubenswrapper[4699]: E1122 04:23:07.366945 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert podName:913c840a-25a6-46f9-bd06-e379438a5292 nodeName:}" failed. No retries permitted until 2025-11-22 04:23:07.866928379 +0000 UTC m=+939.209549566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" (UID: "913c840a-25a6-46f9-bd06-e379438a5292") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.383965 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfr74\" (UniqueName: \"kubernetes.io/projected/a67d0761-3d62-4e25-80bc-cf6fac86cf0b-kube-api-access-sfr74\") pod \"swift-operator-controller-manager-d656998f4-fcjt6\" (UID: \"a67d0761-3d62-4e25-80bc-cf6fac86cf0b\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.387985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9hm\" (UniqueName: \"kubernetes.io/projected/913c840a-25a6-46f9-bd06-e379438a5292-kube-api-access-rt9hm\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.388385 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.388714 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6lg\" (UniqueName: \"kubernetes.io/projected/7199810d-9e13-4ed5-a4bc-46c874551678-kube-api-access-2f6lg\") pod \"ovn-operator-controller-manager-54fc5f65b7-j5b4z\" (UID: \"7199810d-9e13-4ed5-a4bc-46c874551678\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.390252 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.392047 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ntvz8" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.392073 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.394086 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.400920 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.403259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwbq\" (UniqueName: \"kubernetes.io/projected/0669449f-4e6b-4ab7-90ee-f8d93286db7a-kube-api-access-ltwbq\") pod \"placement-operator-controller-manager-5b797b8dff-fpcpg\" (UID: \"0669449f-4e6b-4ab7-90ee-f8d93286db7a\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.404029 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.404962 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.406748 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-h85hg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.408897 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.469086 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpxz\" (UniqueName: \"kubernetes.io/projected/1cf7d81b-c0df-48d7-9b01-b7185a803ac6-kube-api-access-fdpxz\") pod \"test-operator-controller-manager-b4c496f69-j5w8t\" (UID: \"1cf7d81b-c0df-48d7-9b01-b7185a803ac6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.469502 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtnbs\" (UniqueName: \"kubernetes.io/projected/0dc61afc-07cb-46af-afb8-4c0bf3bc84f0-kube-api-access-mtnbs\") pod \"telemetry-operator-controller-manager-6d4bf84b58-lxncl\" (UID: \"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.469585 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh4pf\" (UniqueName: \"kubernetes.io/projected/482b57cb-741a-4062-9479-2a41febc67af-kube-api-access-bh4pf\") pod \"watcher-operator-controller-manager-8c6448b9f-t4rhb\" (UID: \"482b57cb-741a-4062-9479-2a41febc67af\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.479972 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.481533 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.526104 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpxz\" (UniqueName: \"kubernetes.io/projected/1cf7d81b-c0df-48d7-9b01-b7185a803ac6-kube-api-access-fdpxz\") pod \"test-operator-controller-manager-b4c496f69-j5w8t\" (UID: \"1cf7d81b-c0df-48d7-9b01-b7185a803ac6\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.526512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh4pf\" (UniqueName: \"kubernetes.io/projected/482b57cb-741a-4062-9479-2a41febc67af-kube-api-access-bh4pf\") pod \"watcher-operator-controller-manager-8c6448b9f-t4rhb\" (UID: \"482b57cb-741a-4062-9479-2a41febc67af\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.526992 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtnbs\" (UniqueName: \"kubernetes.io/projected/0dc61afc-07cb-46af-afb8-4c0bf3bc84f0-kube-api-access-mtnbs\") pod \"telemetry-operator-controller-manager-6d4bf84b58-lxncl\" (UID: \"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.531275 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.570772 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtdp\" (UniqueName: \"kubernetes.io/projected/4fb724ba-7502-41eb-aab0-40eacbcd652e-kube-api-access-xdtdp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c\" (UID: \"4fb724ba-7502-41eb-aab0-40eacbcd652e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.570854 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.570972 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86xlh\" (UniqueName: \"kubernetes.io/projected/07e128b3-5973-437b-b7ec-80177dacf14f-kube-api-access-86xlh\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.625801 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.672298 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.672781 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.672930 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86xlh\" (UniqueName: \"kubernetes.io/projected/07e128b3-5973-437b-b7ec-80177dacf14f-kube-api-access-86xlh\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.672974 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtdp\" (UniqueName: \"kubernetes.io/projected/4fb724ba-7502-41eb-aab0-40eacbcd652e-kube-api-access-xdtdp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c\" (UID: \"4fb724ba-7502-41eb-aab0-40eacbcd652e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" Nov 22 04:23:07 crc kubenswrapper[4699]: E1122 04:23:07.673374 4699 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 22 04:23:07 crc kubenswrapper[4699]: E1122 04:23:07.673418 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert podName:07e128b3-5973-437b-b7ec-80177dacf14f nodeName:}" failed. No retries permitted until 2025-11-22 04:23:08.173405311 +0000 UTC m=+939.516026498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert") pod "openstack-operator-controller-manager-66b5f67bb4-9h4ls" (UID: "07e128b3-5973-437b-b7ec-80177dacf14f") : secret "webhook-server-cert" not found Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.694327 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86xlh\" (UniqueName: \"kubernetes.io/projected/07e128b3-5973-437b-b7ec-80177dacf14f-kube-api-access-86xlh\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.697450 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtdp\" (UniqueName: \"kubernetes.io/projected/4fb724ba-7502-41eb-aab0-40eacbcd652e-kube-api-access-xdtdp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c\" (UID: \"4fb724ba-7502-41eb-aab0-40eacbcd652e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.738081 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.772396 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls"] Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.774202 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.879255 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.887358 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/913c840a-25a6-46f9-bd06-e379438a5292-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj\" (UID: \"913c840a-25a6-46f9-bd06-e379438a5292\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:07 crc kubenswrapper[4699]: I1122 04:23:07.896884 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.028987 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.152238 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" event={"ID":"455f990d-3a21-4c84-8a9d-e4a4af10c47f","Type":"ContainerStarted","Data":"a624658dee68567311b453492d9a6ab729ceae4536eba5cee456f7f7f5907e2a"} Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.155197 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" event={"ID":"5cda144e-7465-4060-945a-89e3d288c551","Type":"ContainerStarted","Data":"990e43a70004d375a6a3fd912f8fe5e3927c3e85b7cd95c31cb53dfe67ae9035"} Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.187547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.192366 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/07e128b3-5973-437b-b7ec-80177dacf14f-cert\") pod \"openstack-operator-controller-manager-66b5f67bb4-9h4ls\" (UID: \"07e128b3-5973-437b-b7ec-80177dacf14f\") " pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.214350 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-thd4s"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.230486 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a26451_a994_4295_b354_46babc06a258.slice/crio-d6a46b5536432e845f531f09d5d5b1102b9b3ba4d40cdd110b1a7744a69a882b WatchSource:0}: Error finding container d6a46b5536432e845f531f09d5d5b1102b9b3ba4d40cdd110b1a7744a69a882b: Status 404 returned error can't find the container with id d6a46b5536432e845f531f09d5d5b1102b9b3ba4d40cdd110b1a7744a69a882b Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.231809 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.241688 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6fc85a2_ca9a_4d87_811a_4ff8fcdcf477.slice/crio-01e97ff05435d2802503165c945112d54d45616da1d09f3f3e4a07798feb8310 WatchSource:0}: Error finding container 01e97ff05435d2802503165c945112d54d45616da1d09f3f3e4a07798feb8310: Status 404 returned error can't find the container with id 01e97ff05435d2802503165c945112d54d45616da1d09f3f3e4a07798feb8310 Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.247339 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.249469 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0b96c5e_2c37_4f90_b933_2b8f8cbdfaf3.slice/crio-5badf6a235fa39718a612c8f195a5492c2da5b9c4d85a9656434a0295e8b837b WatchSource:0}: Error finding container 5badf6a235fa39718a612c8f195a5492c2da5b9c4d85a9656434a0295e8b837b: Status 404 returned error can't find the container with id 5badf6a235fa39718a612c8f195a5492c2da5b9c4d85a9656434a0295e8b837b Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.252603 4699 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.261739 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.477844 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.618216 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.651520 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.667330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.695730 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc61afc_07cb_46af_afb8_4c0bf3bc84f0.slice/crio-3b206c561da344f9e53d37ee2504355e6ca5b277b9cb7ded501b68e62cc2a525 WatchSource:0}: Error finding container 3b206c561da344f9e53d37ee2504355e6ca5b277b9cb7ded501b68e62cc2a525: Status 404 returned error can't find the container with id 3b206c561da344f9e53d37ee2504355e6ca5b277b9cb7ded501b68e62cc2a525 Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.711629 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.717851 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea3fa32_8451_4f8a_b395_98ce1382e116.slice/crio-13046eff5afc3b0bfad403220165afb960cddffee3f7d666c4fa7a1a6aebbad8 WatchSource:0}: Error finding container 13046eff5afc3b0bfad403220165afb960cddffee3f7d666c4fa7a1a6aebbad8: Status 404 returned error can't find the container with id 13046eff5afc3b0bfad403220165afb960cddffee3f7d666c4fa7a1a6aebbad8 Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.718333 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482b57cb_741a_4062_9479_2a41febc67af.slice/crio-26a11f3376095d8aae6b10002e69e1fd6c61c4d2cee026ebf2e2b2fa71f29342 WatchSource:0}: Error finding container 26a11f3376095d8aae6b10002e69e1fd6c61c4d2cee026ebf2e2b2fa71f29342: Status 404 returned error can't find the container with id 26a11f3376095d8aae6b10002e69e1fd6c61c4d2cee026ebf2e2b2fa71f29342 Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.719125 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.729042 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.741885 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.744861 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb724ba_7502_41eb_aab0_40eacbcd652e.slice/crio-da27239d754651580d4db1011685e0c79d92641c6eb02e02465da87387c77e75 WatchSource:0}: Error finding container da27239d754651580d4db1011685e0c79d92641c6eb02e02465da87387c77e75: Status 404 returned error can't find the 
container with id da27239d754651580d4db1011685e0c79d92641c6eb02e02465da87387c77e75 Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.747583 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36905f20_0246_46f5_921a_2d18b2db8bdd.slice/crio-9326a26b00f9ab00e0da1e4b7008ff57dfe12c8358009f307eba5dba6f8ece59 WatchSource:0}: Error finding container 9326a26b00f9ab00e0da1e4b7008ff57dfe12c8358009f307eba5dba6f8ece59: Status 404 returned error can't find the container with id 9326a26b00f9ab00e0da1e4b7008ff57dfe12c8358009f307eba5dba6f8ece59 Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.748833 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdtdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c_openstack-operators(4fb724ba-7502-41eb-aab0-40eacbcd652e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.748903 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6"] Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.750155 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" podUID="4fb724ba-7502-41eb-aab0-40eacbcd652e" Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.750963 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xdvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-54cfbf4c7d-7rck7_openstack-operators(36905f20-0246-46f5-921a-2d18b2db8bdd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.753414 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.753855 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f34ea0_681d_4a19_b9c9_0c230a7261e3.slice/crio-83983e41e6bdf42951db584998bfa151c3882e5ebfb72930125e9c5b9dca8905 WatchSource:0}: Error finding container 83983e41e6bdf42951db584998bfa151c3882e5ebfb72930125e9c5b9dca8905: Status 404 returned error can't find the container with id 83983e41e6bdf42951db584998bfa151c3882e5ebfb72930125e9c5b9dca8905 Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.755781 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fdpxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-b4c496f69-j5w8t_openstack-operators(1cf7d81b-c0df-48d7-9b01-b7185a803ac6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.758890 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg"] Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.761469 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7r4kg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-cfbb9c588-qlcpr_openstack-operators(e8f34ea0-681d-4a19-b9c9-0c230a7261e3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.774797 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-qsszz"] Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.778007 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7flch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-54b5986bb8-zd2r8_openstack-operators(aca6ad44-aa04-4178-ab59-bfdec68e49e7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.792609 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfr74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d656998f4-fcjt6_openstack-operators(a67d0761-3d62-4e25-80bc-cf6fac86cf0b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: E1122 04:23:08.793129 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5596m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58f887965d-qsszz_openstack-operators(072faef9-c4a0-4bf9-84a8-fadca8945449): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.793645 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.838589 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.843011 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.848147 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls"] Nov 22 04:23:08 crc kubenswrapper[4699]: I1122 04:23:08.897321 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj"] Nov 22 04:23:08 crc kubenswrapper[4699]: W1122 04:23:08.926609 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913c840a_25a6_46f9_bd06_e379438a5292.slice/crio-9d3b18d547acbc060eb3f7d527d28244a8fec8d270ecbbb3040e1e0c2a1c074e WatchSource:0}: Error finding container 9d3b18d547acbc060eb3f7d527d28244a8fec8d270ecbbb3040e1e0c2a1c074e: Status 404 returned error can't find the container with id 9d3b18d547acbc060eb3f7d527d28244a8fec8d270ecbbb3040e1e0c2a1c074e Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.124527 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" podUID="36905f20-0246-46f5-921a-2d18b2db8bdd" Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.132712 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" podUID="1cf7d81b-c0df-48d7-9b01-b7185a803ac6" Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.144957 4699 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" podUID="e8f34ea0-681d-4a19-b9c9-0c230a7261e3" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.167659 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" event={"ID":"e8f34ea0-681d-4a19-b9c9-0c230a7261e3","Type":"ContainerStarted","Data":"ab0a645e4bdd4cbd4fc525659bb88846ffb5c606d69746ddec5816a2da6e720e"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.167708 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" event={"ID":"e8f34ea0-681d-4a19-b9c9-0c230a7261e3","Type":"ContainerStarted","Data":"83983e41e6bdf42951db584998bfa151c3882e5ebfb72930125e9c5b9dca8905"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.169628 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" event={"ID":"aca6ad44-aa04-4178-ab59-bfdec68e49e7","Type":"ContainerStarted","Data":"881a8523c45aa58ed84e413ddd2fe4db017f2e9eaa5a0046f68419c3774aab7a"} Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.186691 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\"" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" podUID="e8f34ea0-681d-4a19-b9c9-0c230a7261e3" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.210848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" 
event={"ID":"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930","Type":"ContainerStarted","Data":"82dfc8177eac54630e95876ba6d087cc133722b30c3badce12d11dab5e991e88"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.212080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" event={"ID":"4fb724ba-7502-41eb-aab0-40eacbcd652e","Type":"ContainerStarted","Data":"da27239d754651580d4db1011685e0c79d92641c6eb02e02465da87387c77e75"} Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.222800 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" podUID="4fb724ba-7502-41eb-aab0-40eacbcd652e" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.223935 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" event={"ID":"482b57cb-741a-4062-9479-2a41febc67af","Type":"ContainerStarted","Data":"26a11f3376095d8aae6b10002e69e1fd6c61c4d2cee026ebf2e2b2fa71f29342"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.245408 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" event={"ID":"b4a26451-a994-4295-b354-46babc06a258","Type":"ContainerStarted","Data":"d6a46b5536432e845f531f09d5d5b1102b9b3ba4d40cdd110b1a7744a69a882b"} Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.265854 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" 
podUID="a67d0761-3d62-4e25-80bc-cf6fac86cf0b" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.271746 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" event={"ID":"a67d0761-3d62-4e25-80bc-cf6fac86cf0b","Type":"ContainerStarted","Data":"c3800a13a32b944c41c7a4b49e3927ca92b53eabf050a037b6e67d5387a48bc7"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.301742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" event={"ID":"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3","Type":"ContainerStarted","Data":"5badf6a235fa39718a612c8f195a5492c2da5b9c4d85a9656434a0295e8b837b"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.306673 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" event={"ID":"1cf7d81b-c0df-48d7-9b01-b7185a803ac6","Type":"ContainerStarted","Data":"1f0bb24ebdf65457eb10e7d9a503c2465c516f3e6a4dd3cb1a55b75d35a08bbf"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.306723 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" event={"ID":"1cf7d81b-c0df-48d7-9b01-b7185a803ac6","Type":"ContainerStarted","Data":"d996dbb8818533c43896906ecf92cae8de4590e57beb0f5f4bd5b7242cba7045"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.309260 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" event={"ID":"36905f20-0246-46f5-921a-2d18b2db8bdd","Type":"ContainerStarted","Data":"395e1d49abc70a231d8634ad492b246b9743600fd395c307a7dc647ccafdcd7a"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.309355 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" 
event={"ID":"36905f20-0246-46f5-921a-2d18b2db8bdd","Type":"ContainerStarted","Data":"9326a26b00f9ab00e0da1e4b7008ff57dfe12c8358009f307eba5dba6f8ece59"} Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.320306 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" podUID="aca6ad44-aa04-4178-ab59-bfdec68e49e7" Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.329776 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" podUID="072faef9-c4a0-4bf9-84a8-fadca8945449" Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.330475 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" podUID="36905f20-0246-46f5-921a-2d18b2db8bdd" Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.330632 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" podUID="1cf7d81b-c0df-48d7-9b01-b7185a803ac6" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.345038 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" 
event={"ID":"34a9f105-8024-4cc0-9ad2-14029731110d","Type":"ContainerStarted","Data":"0315d3268d39a84b16b6afbbdf42ebed33a877e8fd8adee3eb80c7c6627a60ed"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.357736 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" event={"ID":"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0","Type":"ContainerStarted","Data":"3b206c561da344f9e53d37ee2504355e6ca5b277b9cb7ded501b68e62cc2a525"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.377680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" event={"ID":"072faef9-c4a0-4bf9-84a8-fadca8945449","Type":"ContainerStarted","Data":"e88ba22c2f550a58f4dd73cc290fb821b1093a03703567610a67fdb1ac1c707a"} Nov 22 04:23:09 crc kubenswrapper[4699]: E1122 04:23:09.379559 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" podUID="072faef9-c4a0-4bf9-84a8-fadca8945449" Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.382310 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" event={"ID":"0669449f-4e6b-4ab7-90ee-f8d93286db7a","Type":"ContainerStarted","Data":"c8d9b3558f329575755e37695e0f7a209bb36d0d7415405014b445286d6022c6"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.386720 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" 
event={"ID":"913c840a-25a6-46f9-bd06-e379438a5292","Type":"ContainerStarted","Data":"9d3b18d547acbc060eb3f7d527d28244a8fec8d270ecbbb3040e1e0c2a1c074e"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.388333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" event={"ID":"8ea3fa32-8451-4f8a-b395-98ce1382e116","Type":"ContainerStarted","Data":"13046eff5afc3b0bfad403220165afb960cddffee3f7d666c4fa7a1a6aebbad8"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.389829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" event={"ID":"18c1e29a-63b8-4973-92a9-87c5b0301565","Type":"ContainerStarted","Data":"5fc6dd02ab5b680997d79880b7a1c05683c342bcc91488922a7a4e2cbd88668e"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.391151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" event={"ID":"7199810d-9e13-4ed5-a4bc-46c874551678","Type":"ContainerStarted","Data":"2641c718db5ac2a8a0fbfd960d0d906a85a8eddf0b878a8ccbecef62c7d45ae4"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.395210 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" event={"ID":"07e128b3-5973-437b-b7ec-80177dacf14f","Type":"ContainerStarted","Data":"d33459d5b6a1ab132552dade38896998c646bdf3cfacba949e2dc0e925ad940b"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.403519 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" event={"ID":"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477","Type":"ContainerStarted","Data":"01e97ff05435d2802503165c945112d54d45616da1d09f3f3e4a07798feb8310"} Nov 22 04:23:09 crc kubenswrapper[4699]: I1122 04:23:09.404834 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" event={"ID":"aaa73391-c097-4428-a43d-a5a4c1469419","Type":"ContainerStarted","Data":"d01df8a8ceb54cfff5fd0f27343d89b6a36460d1101738f6a9a865ba3f60f295"} Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.419778 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" event={"ID":"a67d0761-3d62-4e25-80bc-cf6fac86cf0b","Type":"ContainerStarted","Data":"b011b4153e52a110b7d058a8b79fa31ac56fe2d265c5dd49d24854429f93792c"} Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.423191 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" event={"ID":"072faef9-c4a0-4bf9-84a8-fadca8945449","Type":"ContainerStarted","Data":"b4a2e781bd92975d103e1681239621ddc429bee181680365c10ceda33843c195"} Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.424242 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" podUID="a67d0761-3d62-4e25-80bc-cf6fac86cf0b" Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.439751 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" podUID="072faef9-c4a0-4bf9-84a8-fadca8945449" Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.449856 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" event={"ID":"07e128b3-5973-437b-b7ec-80177dacf14f","Type":"ContainerStarted","Data":"8f5b464f9048b1f8f329a7afe8dd233b0faf9b54d7c6f9922558609c79113357"} Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.449896 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" event={"ID":"07e128b3-5973-437b-b7ec-80177dacf14f","Type":"ContainerStarted","Data":"477c75fdffa3b752b2ff2da28d6dbdcc9594f49b0c43b8226d2583c2a0515d43"} Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.450555 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.470111 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" event={"ID":"aca6ad44-aa04-4178-ab59-bfdec68e49e7","Type":"ContainerStarted","Data":"9d675109d57b7d3c35d347b016114795a9a2918d18355c5041f013439cfa186f"} Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.474918 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:c053e34316044f14929e16e4f0d97f9f1b24cb68b5e22b925ca74c66aaaed0a7\\\"\"" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" podUID="e8f34ea0-681d-4a19-b9c9-0c230a7261e3" Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.478423 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" podUID="4fb724ba-7502-41eb-aab0-40eacbcd652e" Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.480030 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" podUID="36905f20-0246-46f5-921a-2d18b2db8bdd" Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.480485 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" podUID="aca6ad44-aa04-4178-ab59-bfdec68e49e7" Nov 22 04:23:10 crc kubenswrapper[4699]: E1122 04:23:10.481191 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" podUID="1cf7d81b-c0df-48d7-9b01-b7185a803ac6" Nov 22 04:23:10 crc kubenswrapper[4699]: I1122 04:23:10.484982 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" podStartSLOduration=3.484971299 podStartE2EDuration="3.484971299s" podCreationTimestamp="2025-11-22 04:23:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 04:23:10.483287048 +0000 UTC m=+941.825908245" watchObservedRunningTime="2025-11-22 04:23:10.484971299 +0000 UTC m=+941.827592486" Nov 22 04:23:11 crc kubenswrapper[4699]: E1122 04:23:11.479891 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" podUID="aca6ad44-aa04-4178-ab59-bfdec68e49e7" Nov 22 04:23:11 crc kubenswrapper[4699]: E1122 04:23:11.479956 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c0b5f124a37c1538042c0e63f0978429572e2a851d7f3a6eb80de09b86d755a0\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" podUID="a67d0761-3d62-4e25-80bc-cf6fac86cf0b" Nov 22 04:23:11 crc kubenswrapper[4699]: E1122 04:23:11.479940 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" podUID="072faef9-c4a0-4bf9-84a8-fadca8945449" Nov 22 04:23:18 crc kubenswrapper[4699]: I1122 04:23:18.483398 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-66b5f67bb4-9h4ls" Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.598352 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" 
event={"ID":"18c1e29a-63b8-4973-92a9-87c5b0301565","Type":"ContainerStarted","Data":"4320a512c4499190058fbc916198e084903be0a5f75643a338bd2d60157dd85e"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.660177 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" event={"ID":"34a9f105-8024-4cc0-9ad2-14029731110d","Type":"ContainerStarted","Data":"af165de563867fc06b04bc677b92bf2e45e4a3e7bc82577a62da606c25b7aaad"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.681585 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" event={"ID":"aaa73391-c097-4428-a43d-a5a4c1469419","Type":"ContainerStarted","Data":"9b592db24073d629c3a5d513018c0cffbd49be8885ce61772d1c1cea17f4c8a4"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.682998 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" event={"ID":"455f990d-3a21-4c84-8a9d-e4a4af10c47f","Type":"ContainerStarted","Data":"d98a9a752696747ee4301111e6fce0776e2fc25b3593b4138975a01cf697cf4c"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.692804 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" event={"ID":"8ea3fa32-8451-4f8a-b395-98ce1382e116","Type":"ContainerStarted","Data":"e2519db8444daff5cda0b0e842e31b6f6056ffd863cad7a678b4e3a45a92c960"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.695123 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" event={"ID":"5cda144e-7465-4060-945a-89e3d288c551","Type":"ContainerStarted","Data":"0beb1ebad4c9b72aadedca24f1948aab0f754c4a73b33bafc2a7e3db78b5453b"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.695521 4699 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.704555 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" event={"ID":"482b57cb-741a-4062-9479-2a41febc67af","Type":"ContainerStarted","Data":"99e61904f4cbc4cd25c0cc6dd020a91654d0a72213f81da3886fe87c0ace0ddf"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.729763 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" event={"ID":"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0","Type":"ContainerStarted","Data":"145a1985ab976aaacc8cb1933f44dd85c65dc9dbd30d455c4fb78130a844a5d9"} Nov 22 04:23:20 crc kubenswrapper[4699]: I1122 04:23:20.755160 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" podStartSLOduration=3.02876355 podStartE2EDuration="14.755142835s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:07.886639098 +0000 UTC m=+939.229260285" lastFinishedPulling="2025-11-22 04:23:19.613018383 +0000 UTC m=+950.955639570" observedRunningTime="2025-11-22 04:23:20.748557856 +0000 UTC m=+952.091179053" watchObservedRunningTime="2025-11-22 04:23:20.755142835 +0000 UTC m=+952.097764022" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.737894 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" event={"ID":"913c840a-25a6-46f9-bd06-e379438a5292","Type":"ContainerStarted","Data":"ec85e3b0269c4908c6c4de6f62ac4f5fa8b9589cac2c4ad4d19c4177604bd767"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.738279 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" event={"ID":"913c840a-25a6-46f9-bd06-e379438a5292","Type":"ContainerStarted","Data":"d5e298df08f0eb6412785ba99566c3fea304f506f721a50ef5928f505777e808"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.738353 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.740518 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" event={"ID":"482b57cb-741a-4062-9479-2a41febc67af","Type":"ContainerStarted","Data":"0f2fe3ae936ac424f38efafa0b7883deb2473fdd8f7bc6e9ccdd174ca4768fe3"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.740781 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.743385 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" event={"ID":"0669449f-4e6b-4ab7-90ee-f8d93286db7a","Type":"ContainerStarted","Data":"67c50585fb3313c92c12abacfc932bbfe89698ee7a6be80c63d9f9a28419d4bf"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.743426 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" event={"ID":"0669449f-4e6b-4ab7-90ee-f8d93286db7a","Type":"ContainerStarted","Data":"839126c14540d4ac8742c45da4e7293e03a35e46f20c89a3ebb95834dd3d955b"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.743928 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.746372 
4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" event={"ID":"455f990d-3a21-4c84-8a9d-e4a4af10c47f","Type":"ContainerStarted","Data":"6b4051f51890da810f2abba79bfd5a04748210e8cf9be22977f00b42cbc974f1"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.747056 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.748933 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" event={"ID":"5cda144e-7465-4060-945a-89e3d288c551","Type":"ContainerStarted","Data":"e982879ef7ccd803052ac60b94cadcdb1f47614475e305ed0d567474a6e4c1ed"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.750957 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" event={"ID":"b4a26451-a994-4295-b354-46babc06a258","Type":"ContainerStarted","Data":"b38af614adb2c12753f88308886fd7919dd4a4f2bc48ff49f483c76e3b5bf6df"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.751000 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" event={"ID":"b4a26451-a994-4295-b354-46babc06a258","Type":"ContainerStarted","Data":"2c1b2b3972faa9d14995541c8b4b4d2eba9fcc5de58d3b39aca6f228cd72c6a8"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.751105 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.753301 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" 
event={"ID":"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3","Type":"ContainerStarted","Data":"ee3e73d3bdb38c92f573e870a7f423c64d9735a2319915edb154c05ac6534d11"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.753340 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" event={"ID":"d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3","Type":"ContainerStarted","Data":"c74559fe97a319f77a423dcc534de83bd90c3edb214930127a172999b12dcdd6"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.753458 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.755152 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" event={"ID":"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930","Type":"ContainerStarted","Data":"f72af305548044a8fecef11076f491cee6d9487e31a586878e2fafc221c4938d"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.755196 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" event={"ID":"91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930","Type":"ContainerStarted","Data":"3f2c5d314be02a83c8051b97c189b12a85605c3edec4be524f79d081a91f01d4"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.755748 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.757862 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" event={"ID":"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477","Type":"ContainerStarted","Data":"a64008177c12acbee8e12e33eb2542cf6cbfce25c2ead58352f655bfe41fe729"} Nov 22 04:23:21 crc 
kubenswrapper[4699]: I1122 04:23:21.757886 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" event={"ID":"c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477","Type":"ContainerStarted","Data":"d0df44de7258842902c0dd4e1f7c8437a848ac3a62dd35ea56f9e2bfb684351c"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.758253 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.759400 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" event={"ID":"34a9f105-8024-4cc0-9ad2-14029731110d","Type":"ContainerStarted","Data":"d36d40fc32e16238a4ced9462717b5f8d3116e08264b81a9db5ac7a40408645f"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.759762 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.762012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" event={"ID":"aaa73391-c097-4428-a43d-a5a4c1469419","Type":"ContainerStarted","Data":"844800c933f290cbe16317c25c12583af49a0915681109692e0cb9de02c812e7"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.762237 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.763745 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" 
event={"ID":"0dc61afc-07cb-46af-afb8-4c0bf3bc84f0","Type":"ContainerStarted","Data":"f3a1be65339e9c28452b817be0fc17b1fc201953f821d8994761f215075f82c9"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.763976 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.765539 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" event={"ID":"8ea3fa32-8451-4f8a-b395-98ce1382e116","Type":"ContainerStarted","Data":"9aa77d7cc8ca03ebe5eac3b01f2c0c7486e134c544fb76f4d7183255d6a997ef"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.766043 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.776020 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" podStartSLOduration=5.097516892 podStartE2EDuration="15.775998095s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.936446477 +0000 UTC m=+940.279067664" lastFinishedPulling="2025-11-22 04:23:19.61492768 +0000 UTC m=+950.957548867" observedRunningTime="2025-11-22 04:23:21.766998558 +0000 UTC m=+953.109619745" watchObservedRunningTime="2025-11-22 04:23:21.775998095 +0000 UTC m=+953.118619282" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.779684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" event={"ID":"18c1e29a-63b8-4973-92a9-87c5b0301565","Type":"ContainerStarted","Data":"01bb1dd4732434c3dad905a4f91f24413c713eef62cb47486eaf250315bc3820"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 
04:23:21.780467 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.785916 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" event={"ID":"7199810d-9e13-4ed5-a4bc-46c874551678","Type":"ContainerStarted","Data":"f431efc1a7f7719ed2c9637398d9bd84c5755c3cd208cc49024c1735ad6925c8"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.785945 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" event={"ID":"7199810d-9e13-4ed5-a4bc-46c874551678","Type":"ContainerStarted","Data":"b76e04bd3ea593139e61c57f6f280db0515be131f339cee271032e66dd05e9d4"} Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.786369 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.796056 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" podStartSLOduration=4.8967297290000005 podStartE2EDuration="15.79603859s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.744968647 +0000 UTC m=+940.087589834" lastFinishedPulling="2025-11-22 04:23:19.644277508 +0000 UTC m=+950.986898695" observedRunningTime="2025-11-22 04:23:21.79024325 +0000 UTC m=+953.132864437" watchObservedRunningTime="2025-11-22 04:23:21.79603859 +0000 UTC m=+953.138659777" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.817991 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" podStartSLOduration=4.449268146 
podStartE2EDuration="15.81797325s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.243776126 +0000 UTC m=+939.586397313" lastFinishedPulling="2025-11-22 04:23:19.61248123 +0000 UTC m=+950.955102417" observedRunningTime="2025-11-22 04:23:21.814822294 +0000 UTC m=+953.157443491" watchObservedRunningTime="2025-11-22 04:23:21.81797325 +0000 UTC m=+953.160594437" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.842385 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" podStartSLOduration=4.931848377 podStartE2EDuration="15.84236376s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.702527841 +0000 UTC m=+940.045149028" lastFinishedPulling="2025-11-22 04:23:19.613043224 +0000 UTC m=+950.955664411" observedRunningTime="2025-11-22 04:23:21.838815994 +0000 UTC m=+953.181437191" watchObservedRunningTime="2025-11-22 04:23:21.84236376 +0000 UTC m=+953.184984947" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.861775 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" podStartSLOduration=4.991089669 podStartE2EDuration="15.861758209s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.740995851 +0000 UTC m=+940.083617038" lastFinishedPulling="2025-11-22 04:23:19.611664391 +0000 UTC m=+950.954285578" observedRunningTime="2025-11-22 04:23:21.857589109 +0000 UTC m=+953.200210316" watchObservedRunningTime="2025-11-22 04:23:21.861758209 +0000 UTC m=+953.204379396" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.876965 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" podStartSLOduration=4.517224929 
podStartE2EDuration="15.876946337s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.253979332 +0000 UTC m=+939.596600519" lastFinishedPulling="2025-11-22 04:23:19.61370073 +0000 UTC m=+950.956321927" observedRunningTime="2025-11-22 04:23:21.874761524 +0000 UTC m=+953.217382711" watchObservedRunningTime="2025-11-22 04:23:21.876946337 +0000 UTC m=+953.219567524" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.903174 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" podStartSLOduration=5.005768496 podStartE2EDuration="15.903158171s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.74589981 +0000 UTC m=+940.088520997" lastFinishedPulling="2025-11-22 04:23:19.643289485 +0000 UTC m=+950.985910672" observedRunningTime="2025-11-22 04:23:21.898853827 +0000 UTC m=+953.241475024" watchObservedRunningTime="2025-11-22 04:23:21.903158171 +0000 UTC m=+953.245779358" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.948006 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" podStartSLOduration=4.585613313 podStartE2EDuration="15.947984525s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.250654422 +0000 UTC m=+939.593275609" lastFinishedPulling="2025-11-22 04:23:19.613025624 +0000 UTC m=+950.955646821" observedRunningTime="2025-11-22 04:23:21.932872699 +0000 UTC m=+953.275493886" watchObservedRunningTime="2025-11-22 04:23:21.947984525 +0000 UTC m=+953.290605712" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.965791 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" podStartSLOduration=4.097148689 
podStartE2EDuration="15.965763275s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:07.719311081 +0000 UTC m=+939.061932268" lastFinishedPulling="2025-11-22 04:23:19.587925667 +0000 UTC m=+950.930546854" observedRunningTime="2025-11-22 04:23:21.957871804 +0000 UTC m=+953.300492981" watchObservedRunningTime="2025-11-22 04:23:21.965763275 +0000 UTC m=+953.308384472" Nov 22 04:23:21 crc kubenswrapper[4699]: I1122 04:23:21.978130 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" podStartSLOduration=5.072137151 podStartE2EDuration="15.978114924s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.740674063 +0000 UTC m=+940.083295250" lastFinishedPulling="2025-11-22 04:23:19.646651846 +0000 UTC m=+950.989273023" observedRunningTime="2025-11-22 04:23:21.974930196 +0000 UTC m=+953.317551383" watchObservedRunningTime="2025-11-22 04:23:21.978114924 +0000 UTC m=+953.320736111" Nov 22 04:23:22 crc kubenswrapper[4699]: I1122 04:23:22.030962 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" podStartSLOduration=4.617017593 podStartE2EDuration="16.030946641s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.232882382 +0000 UTC m=+939.575503569" lastFinishedPulling="2025-11-22 04:23:19.64681143 +0000 UTC m=+950.989432617" observedRunningTime="2025-11-22 04:23:22.030769317 +0000 UTC m=+953.373390514" watchObservedRunningTime="2025-11-22 04:23:22.030946641 +0000 UTC m=+953.373567828" Nov 22 04:23:22 crc kubenswrapper[4699]: I1122 04:23:22.035349 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" podStartSLOduration=4.076549197 
podStartE2EDuration="15.035337877s" podCreationTimestamp="2025-11-22 04:23:07 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.746010952 +0000 UTC m=+940.088632139" lastFinishedPulling="2025-11-22 04:23:19.704799622 +0000 UTC m=+951.047420819" observedRunningTime="2025-11-22 04:23:22.0073097 +0000 UTC m=+953.349930887" watchObservedRunningTime="2025-11-22 04:23:22.035337877 +0000 UTC m=+953.377959064" Nov 22 04:23:22 crc kubenswrapper[4699]: I1122 04:23:22.047033 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" podStartSLOduration=5.10316454 podStartE2EDuration="16.04701881s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.668656041 +0000 UTC m=+940.011277238" lastFinishedPulling="2025-11-22 04:23:19.612510321 +0000 UTC m=+950.955131508" observedRunningTime="2025-11-22 04:23:22.043852873 +0000 UTC m=+953.386474060" watchObservedRunningTime="2025-11-22 04:23:22.04701881 +0000 UTC m=+953.389639997" Nov 22 04:23:23 crc kubenswrapper[4699]: I1122 04:23:23.802243 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" event={"ID":"1cf7d81b-c0df-48d7-9b01-b7185a803ac6","Type":"ContainerStarted","Data":"f8a23633b58a71f860b590009dec75147a6dea92ccf34b75c53cf2a986777def"} Nov 22 04:23:23 crc kubenswrapper[4699]: I1122 04:23:23.822424 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" podStartSLOduration=6.4320382 podStartE2EDuration="17.822407448s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.221289762 +0000 UTC m=+939.563910949" lastFinishedPulling="2025-11-22 04:23:19.61165901 +0000 UTC m=+950.954280197" observedRunningTime="2025-11-22 04:23:22.059930022 +0000 UTC m=+953.402551219" 
watchObservedRunningTime="2025-11-22 04:23:23.822407448 +0000 UTC m=+955.165028635" Nov 22 04:23:23 crc kubenswrapper[4699]: I1122 04:23:23.822904 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" podStartSLOduration=2.512905179 podStartE2EDuration="16.82289903s" podCreationTimestamp="2025-11-22 04:23:07 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.755673006 +0000 UTC m=+940.098294203" lastFinishedPulling="2025-11-22 04:23:23.065666867 +0000 UTC m=+954.408288054" observedRunningTime="2025-11-22 04:23:23.816275429 +0000 UTC m=+955.158896636" watchObservedRunningTime="2025-11-22 04:23:23.82289903 +0000 UTC m=+955.165520217" Nov 22 04:23:24 crc kubenswrapper[4699]: I1122 04:23:24.814248 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" event={"ID":"e8f34ea0-681d-4a19-b9c9-0c230a7261e3","Type":"ContainerStarted","Data":"71aa8d038748cc01a7e38ab393d7d2e6984399586ee17c29dd86af1206df7d97"} Nov 22 04:23:24 crc kubenswrapper[4699]: I1122 04:23:24.814521 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:24 crc kubenswrapper[4699]: I1122 04:23:24.834455 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" podStartSLOduration=3.4253947240000002 podStartE2EDuration="18.834412994s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.761307572 +0000 UTC m=+940.103928759" lastFinishedPulling="2025-11-22 04:23:24.170325842 +0000 UTC m=+955.512947029" observedRunningTime="2025-11-22 04:23:24.8305402 +0000 UTC m=+956.173161407" watchObservedRunningTime="2025-11-22 04:23:24.834412994 +0000 UTC m=+956.177034171" Nov 22 04:23:25 crc kubenswrapper[4699]: I1122 
04:23:25.449858 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:23:26 crc kubenswrapper[4699]: I1122 04:23:26.884658 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-qcvls" Nov 22 04:23:26 crc kubenswrapper[4699]: I1122 04:23:26.906303 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-pq7wj" Nov 22 04:23:26 crc kubenswrapper[4699]: I1122 04:23:26.935506 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-6fd98" Nov 22 04:23:26 crc kubenswrapper[4699]: I1122 04:23:26.946191 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7969689c84-thd4s" Nov 22 04:23:26 crc kubenswrapper[4699]: I1122 04:23:26.975708 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-k9wzx" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.039256 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-kd4w8" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.068620 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7875d8bb94-q9xzz" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.149296 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5d95d484b9-g8rz2" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.221603 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-7454b96578-2dh9g" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.314581 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-v289m" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.403542 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-j5b4z" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.487484 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-fpcpg" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.538450 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-lxncl" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.741015 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.780394 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-t4rhb" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.840870 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" event={"ID":"a67d0761-3d62-4e25-80bc-cf6fac86cf0b","Type":"ContainerStarted","Data":"15ca03d526b29299cc779cf3c4eca0f7678383331dd962c3ed959fd33917ca51"} Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.841092 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.842798 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" event={"ID":"4fb724ba-7502-41eb-aab0-40eacbcd652e","Type":"ContainerStarted","Data":"43547e9a623816e9a79df354ad34a5b28bfdae9635026b7b20448eac2e6c5602"} Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.844518 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" event={"ID":"36905f20-0246-46f5-921a-2d18b2db8bdd","Type":"ContainerStarted","Data":"7802445c349d885b79d1c4b30e5fcfed55c20591e2453189ad7b9735fc4e9295"} Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.844704 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.868887 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" podStartSLOduration=3.589083283 podStartE2EDuration="21.868867723s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.792463936 +0000 UTC m=+940.135085123" lastFinishedPulling="2025-11-22 04:23:27.072248376 +0000 UTC m=+958.414869563" observedRunningTime="2025-11-22 04:23:27.863761619 +0000 UTC m=+959.206382826" watchObservedRunningTime="2025-11-22 04:23:27.868867723 +0000 UTC m=+959.211488910" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.883453 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c" podStartSLOduration=2.502017717 podStartE2EDuration="20.883420555s" podCreationTimestamp="2025-11-22 04:23:07 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.748673377 +0000 UTC m=+940.091294564" lastFinishedPulling="2025-11-22 04:23:27.130076215 +0000 UTC m=+958.472697402" 
observedRunningTime="2025-11-22 04:23:27.87827032 +0000 UTC m=+959.220891507" watchObservedRunningTime="2025-11-22 04:23:27.883420555 +0000 UTC m=+959.226041742" Nov 22 04:23:27 crc kubenswrapper[4699]: I1122 04:23:27.898942 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" podStartSLOduration=3.625241317 podStartE2EDuration="21.89892391s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.750791278 +0000 UTC m=+940.093412455" lastFinishedPulling="2025-11-22 04:23:27.024473861 +0000 UTC m=+958.367095048" observedRunningTime="2025-11-22 04:23:27.897948406 +0000 UTC m=+959.240569613" watchObservedRunningTime="2025-11-22 04:23:27.89892391 +0000 UTC m=+959.241545097" Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.035191 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj" Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.853637 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" event={"ID":"aca6ad44-aa04-4178-ab59-bfdec68e49e7","Type":"ContainerStarted","Data":"dbdab7aba49d5591dc53eee59bd75a8d2afb161f19aaab3df5134ffc54de6284"} Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.854112 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.855543 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" event={"ID":"072faef9-c4a0-4bf9-84a8-fadca8945449","Type":"ContainerStarted","Data":"6665c1fd90076aa739de392bd92b6ef20b60b1827925cfc3f9baa1ee32105428"} Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 
04:23:28.855978 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.871009 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" podStartSLOduration=3.051535591 podStartE2EDuration="22.87099187s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.777897263 +0000 UTC m=+940.120518450" lastFinishedPulling="2025-11-22 04:23:28.597353522 +0000 UTC m=+959.939974729" observedRunningTime="2025-11-22 04:23:28.868201122 +0000 UTC m=+960.210822329" watchObservedRunningTime="2025-11-22 04:23:28.87099187 +0000 UTC m=+960.213613057" Nov 22 04:23:28 crc kubenswrapper[4699]: I1122 04:23:28.882726 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" podStartSLOduration=3.0585148 podStartE2EDuration="22.882709003s" podCreationTimestamp="2025-11-22 04:23:06 +0000 UTC" firstStartedPulling="2025-11-22 04:23:08.792982958 +0000 UTC m=+940.135604145" lastFinishedPulling="2025-11-22 04:23:28.617177161 +0000 UTC m=+959.959798348" observedRunningTime="2025-11-22 04:23:28.880153041 +0000 UTC m=+960.222774228" watchObservedRunningTime="2025-11-22 04:23:28.882709003 +0000 UTC m=+960.225330190" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.274854 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58f887965d-qsszz" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.294657 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-zd2r8" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.320167 4699 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-qlcpr" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.484945 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d656998f4-fcjt6" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.629447 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-7rck7" Nov 22 04:23:37 crc kubenswrapper[4699]: I1122 04:23:37.743893 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-b4c496f69-j5w8t" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.682351 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.687389 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.691078 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4pnbb" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.691239 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.692272 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.694725 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.695117 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.736042 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.737586 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.739802 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.747130 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.759565 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwl29\" (UniqueName: \"kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.759616 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.861269 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.861400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzbx\" (UniqueName: \"kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 
22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.861455 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.861482 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.861509 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwl29\" (UniqueName: \"kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.863018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.886003 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwl29\" (UniqueName: \"kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29\") pod \"dnsmasq-dns-675f4bcbfc-wkcgq\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 
04:23:58.962734 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzbx\" (UniqueName: \"kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.963101 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.963123 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.964025 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.964098 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:58.980312 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzbx\" 
(UniqueName: \"kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx\") pod \"dnsmasq-dns-78dd6ddcc-4jkql\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:59.007673 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:23:59 crc kubenswrapper[4699]: I1122 04:23:59.054261 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:24:00 crc kubenswrapper[4699]: I1122 04:24:00.015042 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:24:00 crc kubenswrapper[4699]: I1122 04:24:00.039574 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:24:00 crc kubenswrapper[4699]: I1122 04:24:00.099080 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" event={"ID":"6c13e026-7a43-4305-8f54-56fd074b250f","Type":"ContainerStarted","Data":"a1cd9db0b1c91cd6e4a77a681c4e78ea2dea15f71a2fb81fea073701ca886de1"} Nov 22 04:24:00 crc kubenswrapper[4699]: I1122 04:24:00.110340 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" event={"ID":"c116b90d-3133-4d05-90d8-196fd5be31f7","Type":"ContainerStarted","Data":"077de632befd974f35489e89b771a2393ae4449de1eb0ebd59c16805c08bc1f4"} Nov 22 04:24:01 crc kubenswrapper[4699]: I1122 04:24:01.865311 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:24:01 crc kubenswrapper[4699]: I1122 04:24:01.891779 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:24:01 crc kubenswrapper[4699]: I1122 04:24:01.893227 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:01 crc kubenswrapper[4699]: I1122 04:24:01.910619 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.013267 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn48z\" (UniqueName: \"kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.013337 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.013366 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.114296 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn48z\" (UniqueName: \"kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.114371 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.114403 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.115378 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.116217 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.151535 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn48z\" (UniqueName: \"kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z\") pod \"dnsmasq-dns-666b6646f7-27dtv\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.213159 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.243850 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.273772 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.279640 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.296814 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.420981 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.421059 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmsc\" (UniqueName: \"kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.421099 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 
04:24:02.525154 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.525831 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmsc\" (UniqueName: \"kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.525915 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.527349 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.528601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.573091 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmsc\" 
(UniqueName: \"kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc\") pod \"dnsmasq-dns-57d769cc4f-8tbxw\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.614016 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.901795 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:02 crc kubenswrapper[4699]: I1122 04:24:02.912507 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:24:02 crc kubenswrapper[4699]: W1122 04:24:02.912693 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode788c26f_418d_49bc_901f_b529753f6fde.slice/crio-e1a2d6952478443eeb997d5e5c86234f078a32936d60c4c874d7a98b805bfeac WatchSource:0}: Error finding container e1a2d6952478443eeb997d5e5c86234f078a32936d60c4c874d7a98b805bfeac: Status 404 returned error can't find the container with id e1a2d6952478443eeb997d5e5c86234f078a32936d60c4c874d7a98b805bfeac Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.079571 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.083843 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.088478 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vbc4v" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.088574 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.088752 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.088783 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.089049 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.089229 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.089446 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.099826 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139031 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139085 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139176 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdzj9\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139197 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139217 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139256 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139280 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139307 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.139366 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.147564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" event={"ID":"e788c26f-418d-49bc-901f-b529753f6fde","Type":"ContainerStarted","Data":"e1a2d6952478443eeb997d5e5c86234f078a32936d60c4c874d7a98b805bfeac"} Nov 22 04:24:03 crc 
kubenswrapper[4699]: I1122 04:24:03.149636 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" event={"ID":"eada2686-6d38-4401-b287-0fef18eb37f4","Type":"ContainerStarted","Data":"0680cdfeb40a7802c30151d87252ec1f9fed36d2dbf3c9e481d514052119e496"} Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240533 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240585 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240647 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240664 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240687 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240707 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240747 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240772 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdzj9\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240788 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " 
pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.240804 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.241224 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.243028 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.243037 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.244789 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.244818 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.245383 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.247616 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.249498 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.250112 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.253146 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " 
pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.267165 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdzj9\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.270701 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") " pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.399361 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.400577 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.402999 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.412846 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.412881 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.413621 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.414058 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.415154 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.417113 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.422831 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s5km4" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.438062 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443639 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443708 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443734 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 
04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443751 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443771 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443788 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443808 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443830 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc 
kubenswrapper[4699]: I1122 04:24:03.443853 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pql4w\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.443881 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545147 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545292 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545310 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545346 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545379 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545404 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545458 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545503 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545572 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.545624 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pql4w\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.547778 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.548629 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.548703 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.549094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.550698 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.550762 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.550837 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.554176 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.560827 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.560892 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.564339 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pql4w\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.599936 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.724154 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:24:03 crc kubenswrapper[4699]: I1122 04:24:03.948298 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:24:03 crc kubenswrapper[4699]: W1122 04:24:03.967013 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964a7a4a_f709_43ea_85f2_93a8273d503d.slice/crio-851562825c295b58e1efb42ba4c22c75ff571041dfc856a2c7150d6f21a2b299 WatchSource:0}: Error finding container 851562825c295b58e1efb42ba4c22c75ff571041dfc856a2c7150d6f21a2b299: Status 404 returned error can't find the container with id 851562825c295b58e1efb42ba4c22c75ff571041dfc856a2c7150d6f21a2b299 Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.168180 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerStarted","Data":"851562825c295b58e1efb42ba4c22c75ff571041dfc856a2c7150d6f21a2b299"} Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.292557 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.944302 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.946755 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.950459 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-wx69n" Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.951235 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.951617 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.951820 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.952150 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 04:24:04 crc kubenswrapper[4699]: I1122 04:24:04.969478 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079563 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079613 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079659 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079689 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079735 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcbfm\" (UniqueName: \"kubernetes.io/projected/57084326-d72e-40cb-9905-ca75d50f51e3-kube-api-access-tcbfm\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079765 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079784 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.079816 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181169 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbfm\" (UniqueName: \"kubernetes.io/projected/57084326-d72e-40cb-9905-ca75d50f51e3-kube-api-access-tcbfm\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181212 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181239 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181275 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181306 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181324 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.181375 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.182735 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.185545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc 
kubenswrapper[4699]: I1122 04:24:05.185830 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-kolla-config\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.186170 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-config-data-default\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.187357 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57084326-d72e-40cb-9905-ca75d50f51e3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.223662 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerStarted","Data":"150006f975a9f60f659431357eac4f00f61ab9231d38d7e5a06fc25b10419734"} Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.230203 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbfm\" (UniqueName: \"kubernetes.io/projected/57084326-d72e-40cb-9905-ca75d50f51e3-kube-api-access-tcbfm\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.251955 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.257166 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.262347 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57084326-d72e-40cb-9905-ca75d50f51e3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"57084326-d72e-40cb-9905-ca75d50f51e3\") " pod="openstack/openstack-galera-0" Nov 22 04:24:05 crc kubenswrapper[4699]: I1122 04:24:05.273807 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.056334 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 04:24:06 crc kubenswrapper[4699]: W1122 04:24:06.090490 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57084326_d72e_40cb_9905_ca75d50f51e3.slice/crio-8cbc80d7d418363488723bb956ec753eeb76894340011b6bf5f54c653a58c934 WatchSource:0}: Error finding container 8cbc80d7d418363488723bb956ec753eeb76894340011b6bf5f54c653a58c934: Status 404 returned error can't find the container with id 8cbc80d7d418363488723bb956ec753eeb76894340011b6bf5f54c653a58c934 Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.281778 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"57084326-d72e-40cb-9905-ca75d50f51e3","Type":"ContainerStarted","Data":"8cbc80d7d418363488723bb956ec753eeb76894340011b6bf5f54c653a58c934"} Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.436988 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.438787 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.447167 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.447530 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.447734 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-z9km6" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.455672 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.457328 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.459987 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.460659 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4s5pq" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.460822 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.460965 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.463798 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.471745 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.632038 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-kolla-config\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.632103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.632122 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-config-data\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.632931 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.632996 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633068 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633143 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633337 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633366 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9hs\" (UniqueName: \"kubernetes.io/projected/02e377d7-9e5a-45ec-9460-16af64ce3db5-kube-api-access-qc9hs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633384 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5v2w\" (UniqueName: \"kubernetes.io/projected/e74585bc-d1cf-473d-95ca-12c816ff0020-kube-api-access-v5v2w\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633402 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633428 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.633513 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736454 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736531 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736576 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9hs\" (UniqueName: \"kubernetes.io/projected/02e377d7-9e5a-45ec-9460-16af64ce3db5-kube-api-access-qc9hs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736692 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5v2w\" (UniqueName: \"kubernetes.io/projected/e74585bc-d1cf-473d-95ca-12c816ff0020-kube-api-access-v5v2w\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736712 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736757 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736774 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736809 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-kolla-config\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736830 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-config-data\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736864 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.736883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") 
" pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.737692 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.738261 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-kolla-config\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.739106 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.739125 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e74585bc-d1cf-473d-95ca-12c816ff0020-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.739293 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/02e377d7-9e5a-45ec-9460-16af64ce3db5-config-data\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.739823 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.740878 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e74585bc-d1cf-473d-95ca-12c816ff0020-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.755537 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.755547 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.755625 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02e377d7-9e5a-45ec-9460-16af64ce3db5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.756006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74585bc-d1cf-473d-95ca-12c816ff0020-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.764991 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9hs\" (UniqueName: \"kubernetes.io/projected/02e377d7-9e5a-45ec-9460-16af64ce3db5-kube-api-access-qc9hs\") pod \"memcached-0\" (UID: \"02e377d7-9e5a-45ec-9460-16af64ce3db5\") " pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.765633 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5v2w\" (UniqueName: \"kubernetes.io/projected/e74585bc-d1cf-473d-95ca-12c816ff0020-kube-api-access-v5v2w\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.782774 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e74585bc-d1cf-473d-95ca-12c816ff0020\") " pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.793822 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 04:24:06 crc kubenswrapper[4699]: I1122 04:24:06.818689 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 04:24:07 crc kubenswrapper[4699]: I1122 04:24:07.363607 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 04:24:07 crc kubenswrapper[4699]: W1122 04:24:07.393761 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e377d7_9e5a_45ec_9460_16af64ce3db5.slice/crio-4040bf4a9c47be011180dcfaf3a3a8e0e6b3c62836b9459bd55c379e7684bc0c WatchSource:0}: Error finding container 4040bf4a9c47be011180dcfaf3a3a8e0e6b3c62836b9459bd55c379e7684bc0c: Status 404 returned error can't find the container with id 4040bf4a9c47be011180dcfaf3a3a8e0e6b3c62836b9459bd55c379e7684bc0c Nov 22 04:24:07 crc kubenswrapper[4699]: I1122 04:24:07.480425 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.344408 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"02e377d7-9e5a-45ec-9460-16af64ce3db5","Type":"ContainerStarted","Data":"4040bf4a9c47be011180dcfaf3a3a8e0e6b3c62836b9459bd55c379e7684bc0c"} Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.348376 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e74585bc-d1cf-473d-95ca-12c816ff0020","Type":"ContainerStarted","Data":"e8f4b55259330ea6cf9ae94c8e1f4dba5146dcb636eb03f8ea39c69ac143bbce"} Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.440321 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.441737 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.461004 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-9w6kc" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.488334 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.581714 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqfmk\" (UniqueName: \"kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk\") pod \"kube-state-metrics-0\" (UID: \"c970cf2e-16a0-42fe-ba32-ee217bd82db8\") " pod="openstack/kube-state-metrics-0" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.683069 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqfmk\" (UniqueName: \"kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk\") pod \"kube-state-metrics-0\" (UID: \"c970cf2e-16a0-42fe-ba32-ee217bd82db8\") " pod="openstack/kube-state-metrics-0" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.702723 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqfmk\" (UniqueName: \"kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk\") pod \"kube-state-metrics-0\" (UID: \"c970cf2e-16a0-42fe-ba32-ee217bd82db8\") " pod="openstack/kube-state-metrics-0" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.725839 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.725898 4699 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:08 crc kubenswrapper[4699]: I1122 04:24:08.817370 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.920888 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7mlz"] Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.922374 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.925600 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lpskt" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.925820 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.925965 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.928614 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-j7b96"] Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.930548 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.946767 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7mlz"] Nov 22 04:24:12 crc kubenswrapper[4699]: I1122 04:24:12.950552 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j7b96"] Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063726 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-ovn-controller-tls-certs\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063782 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-combined-ca-bundle\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063804 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063831 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-etc-ovs\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 
04:24:13.063851 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-log\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063869 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55053527-f2d2-4e44-8a9c-153b74ef3605-scripts\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063891 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-log-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063920 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-run\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063937 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmx8g\" (UniqueName: \"kubernetes.io/projected/55053527-f2d2-4e44-8a9c-153b74ef3605-kube-api-access-nmx8g\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063961 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzh4j\" (UniqueName: \"kubernetes.io/projected/0311366c-c8c7-449c-b617-213a4d87de00-kube-api-access-kzh4j\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.063990 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.064005 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-lib\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.064054 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0311366c-c8c7-449c-b617-213a4d87de00-scripts\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165219 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-run\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165877 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nmx8g\" (UniqueName: \"kubernetes.io/projected/55053527-f2d2-4e44-8a9c-153b74ef3605-kube-api-access-nmx8g\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165915 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzh4j\" (UniqueName: \"kubernetes.io/projected/0311366c-c8c7-449c-b617-213a4d87de00-kube-api-access-kzh4j\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165949 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165965 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-lib\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166012 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0311366c-c8c7-449c-b617-213a4d87de00-scripts\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166026 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-ovn-controller-tls-certs\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166046 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-combined-ca-bundle\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166065 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166084 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-etc-ovs\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166103 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-log\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166123 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55053527-f2d2-4e44-8a9c-153b74ef3605-scripts\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") 
" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166143 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-log-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166291 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-log-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.165841 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-run\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166827 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run-ovn\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.166950 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-lib\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.169006 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/0311366c-c8c7-449c-b617-213a4d87de00-scripts\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.169985 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-etc-ovs\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.170302 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0311366c-c8c7-449c-b617-213a4d87de00-var-run\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.170454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/55053527-f2d2-4e44-8a9c-153b74ef3605-var-log\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.175016 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-combined-ca-bundle\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.175500 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0311366c-c8c7-449c-b617-213a4d87de00-ovn-controller-tls-certs\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " 
pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.177842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55053527-f2d2-4e44-8a9c-153b74ef3605-scripts\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.186136 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzh4j\" (UniqueName: \"kubernetes.io/projected/0311366c-c8c7-449c-b617-213a4d87de00-kube-api-access-kzh4j\") pod \"ovn-controller-s7mlz\" (UID: \"0311366c-c8c7-449c-b617-213a4d87de00\") " pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.187772 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmx8g\" (UniqueName: \"kubernetes.io/projected/55053527-f2d2-4e44-8a9c-153b74ef3605-kube-api-access-nmx8g\") pod \"ovn-controller-ovs-j7b96\" (UID: \"55053527-f2d2-4e44-8a9c-153b74ef3605\") " pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.252571 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7mlz" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.266523 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.807871 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.809750 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.813579 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.813618 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.813735 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.814136 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-89qjh" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.814376 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.819965 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.978962 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979024 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979060 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979088 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979168 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979275 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979293 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:13 crc kubenswrapper[4699]: I1122 04:24:13.979349 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47g8g\" (UniqueName: 
\"kubernetes.io/projected/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-kube-api-access-47g8g\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081055 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47g8g\" (UniqueName: \"kubernetes.io/projected/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-kube-api-access-47g8g\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081131 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081170 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081199 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081224 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081247 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081278 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081296 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.081654 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.082459 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.083343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.083672 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.086069 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.086558 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.088639 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.102755 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47g8g\" (UniqueName: \"kubernetes.io/projected/fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa-kube-api-access-47g8g\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " 
pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.123599 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa\") " pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:14 crc kubenswrapper[4699]: I1122 04:24:14.135324 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.082760 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.084633 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.088218 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.088335 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-58lqb" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.088464 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.091927 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.098308 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215790 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215838 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215863 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215881 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgnq\" (UniqueName: \"kubernetes.io/projected/3e31d684-0292-4e13-8bce-9af3fbcb09cb-kube-api-access-txgnq\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215904 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215948 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " 
pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.215996 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.216023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317788 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317838 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317862 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317886 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317901 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgnq\" (UniqueName: \"kubernetes.io/projected/3e31d684-0292-4e13-8bce-9af3fbcb09cb-kube-api-access-txgnq\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.317969 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.318026 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.318630 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.318837 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.319064 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-config\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.320029 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e31d684-0292-4e13-8bce-9af3fbcb09cb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.322156 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.322511 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.323604 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e31d684-0292-4e13-8bce-9af3fbcb09cb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.345119 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgnq\" (UniqueName: \"kubernetes.io/projected/3e31d684-0292-4e13-8bce-9af3fbcb09cb-kube-api-access-txgnq\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.347161 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3e31d684-0292-4e13-8bce-9af3fbcb09cb\") " pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:16 crc kubenswrapper[4699]: I1122 04:24:16.415509 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 04:24:32 crc kubenswrapper[4699]: E1122 04:24:32.820216 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 22 04:24:32 crc kubenswrapper[4699]: E1122 04:24:32.820814 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tcbfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,S
ecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(57084326-d72e-40cb-9905-ca75d50f51e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:32 crc kubenswrapper[4699]: E1122 04:24:32.822399 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="57084326-d72e-40cb-9905-ca75d50f51e3" Nov 22 04:24:33 crc kubenswrapper[4699]: E1122 04:24:33.529874 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="57084326-d72e-40cb-9905-ca75d50f51e3" Nov 22 04:24:38 crc kubenswrapper[4699]: I1122 04:24:38.726170 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:24:38 crc kubenswrapper[4699]: I1122 04:24:38.726589 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" 
podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.592399 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.593152 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pql4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(43d42bf1-de55-49eb-990f-451ad31d0e21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:50 crc 
kubenswrapper[4699]: E1122 04:24:50.594411 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.678649 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.698500 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.698742 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5v2w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(e74585bc-d1cf-473d-95ca-12c816ff0020): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:50 crc kubenswrapper[4699]: E1122 04:24:50.700756 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="e74585bc-d1cf-473d-95ca-12c816ff0020" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.057423 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.057650 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdzj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(964a7a4a-f709-43ea-85f2-93a8273d503d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:51 crc 
kubenswrapper[4699]: E1122 04:24:51.059003 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.684788 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="e74585bc-d1cf-473d-95ca-12c816ff0020" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.686151 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.988632 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.989116 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwl29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-wkcgq_openstack(c116b90d-3133-4d05-90d8-196fd5be31f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:51 crc kubenswrapper[4699]: E1122 04:24:51.990939 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" podUID="c116b90d-3133-4d05-90d8-196fd5be31f7" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.314015 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.314167 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gzbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4jkql_openstack(6c13e026-7a43-4305-8f54-56fd074b250f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.315459 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" podUID="6c13e026-7a43-4305-8f54-56fd074b250f" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.450769 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.450996 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:ncfhf6h648h87h68h674h5d6h654h5bch566h555h6h65ch5f7h577h54h7ch59dh99h595h678hd5h648h697h55fhdh66bh5bdh68bh686h5d9h55dq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qc9hs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(02e377d7-9e5a-45ec-9460-16af64ce3db5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.452180 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="02e377d7-9e5a-45ec-9460-16af64ce3db5" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.558699 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.558905 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kn48z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{
},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-27dtv_openstack(eada2686-6d38-4401-b287-0fef18eb37f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.562457 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" podUID="eada2686-6d38-4401-b287-0fef18eb37f4" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.566861 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.567038 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndmsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-8tbxw_openstack(e788c26f-418d-49bc-901f-b529753f6fde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.568321 4699 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" podUID="e788c26f-418d-49bc-901f-b529753f6fde" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.699801 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" podUID="e788c26f-418d-49bc-901f-b529753f6fde" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.700136 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" podUID="eada2686-6d38-4401-b287-0fef18eb37f4" Nov 22 04:24:52 crc kubenswrapper[4699]: E1122 04:24:52.700250 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="02e377d7-9e5a-45ec-9460-16af64ce3db5" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.081932 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.087549 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.135212 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc\") pod \"6c13e026-7a43-4305-8f54-56fd074b250f\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.135274 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwl29\" (UniqueName: \"kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29\") pod \"c116b90d-3133-4d05-90d8-196fd5be31f7\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.135313 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gzbx\" (UniqueName: \"kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx\") pod \"6c13e026-7a43-4305-8f54-56fd074b250f\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.135596 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config\") pod \"6c13e026-7a43-4305-8f54-56fd074b250f\" (UID: \"6c13e026-7a43-4305-8f54-56fd074b250f\") " Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.135665 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config\") pod \"c116b90d-3133-4d05-90d8-196fd5be31f7\" (UID: \"c116b90d-3133-4d05-90d8-196fd5be31f7\") " Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.136129 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c13e026-7a43-4305-8f54-56fd074b250f" (UID: "6c13e026-7a43-4305-8f54-56fd074b250f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.136355 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config" (OuterVolumeSpecName: "config") pod "6c13e026-7a43-4305-8f54-56fd074b250f" (UID: "6c13e026-7a43-4305-8f54-56fd074b250f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.136509 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config" (OuterVolumeSpecName: "config") pod "c116b90d-3133-4d05-90d8-196fd5be31f7" (UID: "c116b90d-3133-4d05-90d8-196fd5be31f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.142300 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29" (OuterVolumeSpecName: "kube-api-access-lwl29") pod "c116b90d-3133-4d05-90d8-196fd5be31f7" (UID: "c116b90d-3133-4d05-90d8-196fd5be31f7"). InnerVolumeSpecName "kube-api-access-lwl29". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.142364 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx" (OuterVolumeSpecName: "kube-api-access-5gzbx") pod "6c13e026-7a43-4305-8f54-56fd074b250f" (UID: "6c13e026-7a43-4305-8f54-56fd074b250f"). InnerVolumeSpecName "kube-api-access-5gzbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.185080 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:24:53 crc kubenswrapper[4699]: W1122 04:24:53.186649 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc970cf2e_16a0_42fe_ba32_ee217bd82db8.slice/crio-474fca0310fc3d04a4748fc896f81028db9d710e1ca887517833c1044552ebe0 WatchSource:0}: Error finding container 474fca0310fc3d04a4748fc896f81028db9d710e1ca887517833c1044552ebe0: Status 404 returned error can't find the container with id 474fca0310fc3d04a4748fc896f81028db9d710e1ca887517833c1044552ebe0 Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.194381 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7mlz"] Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.236948 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.237285 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwl29\" (UniqueName: \"kubernetes.io/projected/c116b90d-3133-4d05-90d8-196fd5be31f7-kube-api-access-lwl29\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.237299 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gzbx\" (UniqueName: \"kubernetes.io/projected/6c13e026-7a43-4305-8f54-56fd074b250f-kube-api-access-5gzbx\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.237308 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c13e026-7a43-4305-8f54-56fd074b250f-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:53 crc 
kubenswrapper[4699]: I1122 04:24:53.237317 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c116b90d-3133-4d05-90d8-196fd5be31f7-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.304708 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 04:24:53 crc kubenswrapper[4699]: W1122 04:24:53.307651 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb62cd5c_3d93_4b7d_810c_0ee46c6f90fa.slice/crio-327576e06ce5b4b26b8b0f6bc91bf341bfdbfde844dc98caee284568c82b2686 WatchSource:0}: Error finding container 327576e06ce5b4b26b8b0f6bc91bf341bfdbfde844dc98caee284568c82b2686: Status 404 returned error can't find the container with id 327576e06ce5b4b26b8b0f6bc91bf341bfdbfde844dc98caee284568c82b2686 Nov 22 04:24:53 crc kubenswrapper[4699]: E1122 04:24:53.604839 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc116b90d_3133_4d05_90d8_196fd5be31f7.slice/crio-077de632befd974f35489e89b771a2393ae4449de1eb0ebd59c16805c08bc1f4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c13e026_7a43_4305_8f54_56fd074b250f.slice/crio-a1cd9db0b1c91cd6e4a77a681c4e78ea2dea15f71a2fb81fea073701ca886de1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc116b90d_3133_4d05_90d8_196fd5be31f7.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.708323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"57084326-d72e-40cb-9905-ca75d50f51e3","Type":"ContainerStarted","Data":"099bf51e438cc0285fab82f0549e5154b84c62429924ff8376a0149a87b1873c"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.711215 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" event={"ID":"6c13e026-7a43-4305-8f54-56fd074b250f","Type":"ContainerDied","Data":"a1cd9db0b1c91cd6e4a77a681c4e78ea2dea15f71a2fb81fea073701ca886de1"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.712342 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa","Type":"ContainerStarted","Data":"327576e06ce5b4b26b8b0f6bc91bf341bfdbfde844dc98caee284568c82b2686"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.712668 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4jkql" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.713426 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.713936 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-wkcgq" event={"ID":"c116b90d-3133-4d05-90d8-196fd5be31f7","Type":"ContainerDied","Data":"077de632befd974f35489e89b771a2393ae4449de1eb0ebd59c16805c08bc1f4"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.717393 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c970cf2e-16a0-42fe-ba32-ee217bd82db8","Type":"ContainerStarted","Data":"474fca0310fc3d04a4748fc896f81028db9d710e1ca887517833c1044552ebe0"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.718463 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz" event={"ID":"0311366c-c8c7-449c-b617-213a4d87de00","Type":"ContainerStarted","Data":"17bc0f6e7043150b67c487b7ce382eab97745b78802b509005e3edb683812633"} Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.766384 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.772038 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4jkql"] Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.801310 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:24:53 crc kubenswrapper[4699]: I1122 04:24:53.806353 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-wkcgq"] Nov 22 04:24:54 crc kubenswrapper[4699]: I1122 04:24:54.060188 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j7b96"] Nov 22 04:24:54 crc kubenswrapper[4699]: I1122 04:24:54.296102 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 04:24:54 crc 
kubenswrapper[4699]: I1122 04:24:54.750870 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j7b96" event={"ID":"55053527-f2d2-4e44-8a9c-153b74ef3605","Type":"ContainerStarted","Data":"9c0c34d9723bf8df72d44ad90993b5e3f6300662f2f38696fc888e7aca4cd64b"} Nov 22 04:24:55 crc kubenswrapper[4699]: I1122 04:24:55.462856 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c13e026-7a43-4305-8f54-56fd074b250f" path="/var/lib/kubelet/pods/6c13e026-7a43-4305-8f54-56fd074b250f/volumes" Nov 22 04:24:55 crc kubenswrapper[4699]: I1122 04:24:55.463757 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c116b90d-3133-4d05-90d8-196fd5be31f7" path="/var/lib/kubelet/pods/c116b90d-3133-4d05-90d8-196fd5be31f7/volumes" Nov 22 04:24:56 crc kubenswrapper[4699]: W1122 04:24:56.100767 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e31d684_0292_4e13_8bce_9af3fbcb09cb.slice/crio-e58269675f46565c6c2e63ee4b013926f194fecd93db105569ec614dea0841e1 WatchSource:0}: Error finding container e58269675f46565c6c2e63ee4b013926f194fecd93db105569ec614dea0841e1: Status 404 returned error can't find the container with id e58269675f46565c6c2e63ee4b013926f194fecd93db105569ec614dea0841e1 Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.798583 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-x9j67"] Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.800487 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.804918 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.814292 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3e31d684-0292-4e13-8bce-9af3fbcb09cb","Type":"ContainerStarted","Data":"e58269675f46565c6c2e63ee4b013926f194fecd93db105569ec614dea0841e1"} Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.815023 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-x9j67"] Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.916608 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-combined-ca-bundle\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.916802 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnsl4\" (UniqueName: \"kubernetes.io/projected/14d98ff4-07de-4764-a1a6-238316e83ee3-kube-api-access-nnsl4\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.916964 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovn-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 
04:24:56.917097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.917321 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d98ff4-07de-4764-a1a6-238316e83ee3-config\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.917364 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovs-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.945350 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.973192 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.975756 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.978468 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 04:24:56 crc kubenswrapper[4699]: I1122 04:24:56.997527 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019172 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019235 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019266 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s5jb\" (UniqueName: \"kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019353 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d98ff4-07de-4764-a1a6-238316e83ee3-config\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 
04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019381 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovs-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019416 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-combined-ca-bundle\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019460 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnsl4\" (UniqueName: \"kubernetes.io/projected/14d98ff4-07de-4764-a1a6-238316e83ee3-kube-api-access-nnsl4\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019484 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.019505 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc 
kubenswrapper[4699]: I1122 04:24:57.019723 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovn-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.020105 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovn-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.020106 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/14d98ff4-07de-4764-a1a6-238316e83ee3-ovs-rundir\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.020852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14d98ff4-07de-4764-a1a6-238316e83ee3-config\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.030998 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-combined-ca-bundle\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.032026 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/14d98ff4-07de-4764-a1a6-238316e83ee3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.070545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnsl4\" (UniqueName: \"kubernetes.io/projected/14d98ff4-07de-4764-a1a6-238316e83ee3-kube-api-access-nnsl4\") pod \"ovn-controller-metrics-x9j67\" (UID: \"14d98ff4-07de-4764-a1a6-238316e83ee3\") " pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.121560 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.122075 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.122154 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.122200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s5jb\" (UniqueName: 
\"kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.123179 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.123503 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.123866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.136882 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-x9j67" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.144340 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.163390 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s5jb\" (UniqueName: \"kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb\") pod \"dnsmasq-dns-7fd796d7df-ztht2\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.196497 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.197958 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.202365 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.229961 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.230026 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.230087 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwlkq\" (UniqueName: \"kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.230113 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.230190 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.235086 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.304158 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.331697 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.332811 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.332840 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.332910 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.332928 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwlkq\" (UniqueName: \"kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" 
Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.333682 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.334224 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.334811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.334987 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.353088 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwlkq\" (UniqueName: \"kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq\") pod \"dnsmasq-dns-86db49b7ff-2glrx\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:57 crc kubenswrapper[4699]: I1122 04:24:57.530519 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.344967 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.369731 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndmsc\" (UniqueName: \"kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc\") pod \"e788c26f-418d-49bc-901f-b529753f6fde\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.369815 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc\") pod \"e788c26f-418d-49bc-901f-b529753f6fde\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.369953 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config\") pod \"e788c26f-418d-49bc-901f-b529753f6fde\" (UID: \"e788c26f-418d-49bc-901f-b529753f6fde\") " Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.370389 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e788c26f-418d-49bc-901f-b529753f6fde" (UID: "e788c26f-418d-49bc-901f-b529753f6fde"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.370398 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config" (OuterVolumeSpecName: "config") pod "e788c26f-418d-49bc-901f-b529753f6fde" (UID: "e788c26f-418d-49bc-901f-b529753f6fde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.370573 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.370597 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e788c26f-418d-49bc-901f-b529753f6fde-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.375216 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc" (OuterVolumeSpecName: "kube-api-access-ndmsc") pod "e788c26f-418d-49bc-901f-b529753f6fde" (UID: "e788c26f-418d-49bc-901f-b529753f6fde"). InnerVolumeSpecName "kube-api-access-ndmsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.472518 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndmsc\" (UniqueName: \"kubernetes.io/projected/e788c26f-418d-49bc-901f-b529753f6fde-kube-api-access-ndmsc\") on node \"crc\" DevicePath \"\"" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.843082 4699 generic.go:334] "Generic (PLEG): container finished" podID="57084326-d72e-40cb-9905-ca75d50f51e3" containerID="099bf51e438cc0285fab82f0549e5154b84c62429924ff8376a0149a87b1873c" exitCode=0 Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.843268 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"57084326-d72e-40cb-9905-ca75d50f51e3","Type":"ContainerDied","Data":"099bf51e438cc0285fab82f0549e5154b84c62429924ff8376a0149a87b1873c"} Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.844490 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" event={"ID":"e788c26f-418d-49bc-901f-b529753f6fde","Type":"ContainerDied","Data":"e1a2d6952478443eeb997d5e5c86234f078a32936d60c4c874d7a98b805bfeac"} Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.844552 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8tbxw" Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.904276 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.935164 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8tbxw"] Nov 22 04:24:59 crc kubenswrapper[4699]: I1122 04:24:59.963925 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.084098 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn48z\" (UniqueName: \"kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z\") pod \"eada2686-6d38-4401-b287-0fef18eb37f4\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.084224 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config\") pod \"eada2686-6d38-4401-b287-0fef18eb37f4\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.084337 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc\") pod \"eada2686-6d38-4401-b287-0fef18eb37f4\" (UID: \"eada2686-6d38-4401-b287-0fef18eb37f4\") " Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.085142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eada2686-6d38-4401-b287-0fef18eb37f4" (UID: "eada2686-6d38-4401-b287-0fef18eb37f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.085262 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config" (OuterVolumeSpecName: "config") pod "eada2686-6d38-4401-b287-0fef18eb37f4" (UID: "eada2686-6d38-4401-b287-0fef18eb37f4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.088243 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z" (OuterVolumeSpecName: "kube-api-access-kn48z") pod "eada2686-6d38-4401-b287-0fef18eb37f4" (UID: "eada2686-6d38-4401-b287-0fef18eb37f4"). InnerVolumeSpecName "kube-api-access-kn48z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.192840 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.192908 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn48z\" (UniqueName: \"kubernetes.io/projected/eada2686-6d38-4401-b287-0fef18eb37f4-kube-api-access-kn48z\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.192932 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eada2686-6d38-4401-b287-0fef18eb37f4-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.854981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" event={"ID":"eada2686-6d38-4401-b287-0fef18eb37f4","Type":"ContainerDied","Data":"0680cdfeb40a7802c30151d87252ec1f9fed36d2dbf3c9e481d514052119e496"} Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.855015 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-27dtv" Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.944245 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:25:00 crc kubenswrapper[4699]: I1122 04:25:00.951130 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-27dtv"] Nov 22 04:25:01 crc kubenswrapper[4699]: I1122 04:25:01.457511 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e788c26f-418d-49bc-901f-b529753f6fde" path="/var/lib/kubelet/pods/e788c26f-418d-49bc-901f-b529753f6fde/volumes" Nov 22 04:25:01 crc kubenswrapper[4699]: I1122 04:25:01.457888 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eada2686-6d38-4401-b287-0fef18eb37f4" path="/var/lib/kubelet/pods/eada2686-6d38-4401-b287-0fef18eb37f4/volumes" Nov 22 04:25:01 crc kubenswrapper[4699]: I1122 04:25:01.501696 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:25:01 crc kubenswrapper[4699]: W1122 04:25:01.590986 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0afac114_a756_478a_a7d0_ec9952944484.slice/crio-546393b3187cfcae29508620bb9ccef1132e3dd9826bcc477ae9f8f8dd727e66 WatchSource:0}: Error finding container 546393b3187cfcae29508620bb9ccef1132e3dd9826bcc477ae9f8f8dd727e66: Status 404 returned error can't find the container with id 546393b3187cfcae29508620bb9ccef1132e3dd9826bcc477ae9f8f8dd727e66 Nov 22 04:25:01 crc kubenswrapper[4699]: I1122 04:25:01.865112 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" event={"ID":"0afac114-a756-478a-a7d0-ec9952944484","Type":"ContainerStarted","Data":"546393b3187cfcae29508620bb9ccef1132e3dd9826bcc477ae9f8f8dd727e66"} Nov 22 04:25:02 crc kubenswrapper[4699]: I1122 04:25:02.001736 4699 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:25:02 crc kubenswrapper[4699]: I1122 04:25:02.037903 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-x9j67"] Nov 22 04:25:02 crc kubenswrapper[4699]: W1122 04:25:02.352493 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14d98ff4_07de_4764_a1a6_238316e83ee3.slice/crio-065c3873e61c0f44c38878b804fd6094173ec22d6e57d322aa58e1c244910636 WatchSource:0}: Error finding container 065c3873e61c0f44c38878b804fd6094173ec22d6e57d322aa58e1c244910636: Status 404 returned error can't find the container with id 065c3873e61c0f44c38878b804fd6094173ec22d6e57d322aa58e1c244910636 Nov 22 04:25:02 crc kubenswrapper[4699]: I1122 04:25:02.876423 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-x9j67" event={"ID":"14d98ff4-07de-4764-a1a6-238316e83ee3","Type":"ContainerStarted","Data":"065c3873e61c0f44c38878b804fd6094173ec22d6e57d322aa58e1c244910636"} Nov 22 04:25:02 crc kubenswrapper[4699]: I1122 04:25:02.878646 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" event={"ID":"c1137043-00d5-43bd-a4e1-6cdc2c17fb88","Type":"ContainerStarted","Data":"fe5795743758b2b5bf651eef0434d985a51b20ba6063c88d90bdfd53bc00c58e"} Nov 22 04:25:04 crc kubenswrapper[4699]: E1122 04:25:04.861109 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Nov 22 04:25:04 crc kubenswrapper[4699]: E1122 04:25:04.861888 4699 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Nov 22 04:25:04 crc kubenswrapper[4699]: E1122 04:25:04.862023 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kqfmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(c970cf2e-16a0-42fe-ba32-ee217bd82db8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 04:25:04 crc kubenswrapper[4699]: E1122 04:25:04.863542 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.895974 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz" event={"ID":"0311366c-c8c7-449c-b617-213a4d87de00","Type":"ContainerStarted","Data":"599f6a7b9a5cd878eb091ea83df9503c89eb7a07fe1899ccba55ae728c53a1fc"} Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.898737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j7b96" 
event={"ID":"55053527-f2d2-4e44-8a9c-153b74ef3605","Type":"ContainerStarted","Data":"6641942a5ee354c45c6b3eee5f25d4ede5af59e03c744cab527856713544821b"} Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.903221 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"57084326-d72e-40cb-9905-ca75d50f51e3","Type":"ContainerStarted","Data":"2efcff1f32e13c18df5e7f6b8bbaf51ef318f81370944d99bd063b7030db61ed"} Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.906108 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa","Type":"ContainerStarted","Data":"5caf308d8c8db916b0fe79fd1f9c62ed9e40139306d17d39ea3e3c00e35da6a9"} Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.915361 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e74585bc-d1cf-473d-95ca-12c816ff0020","Type":"ContainerStarted","Data":"b035d516fae4eb150a245b83df03ec1139eaa35505122491d4168a0b7204efb8"} Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.917684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3e31d684-0292-4e13-8bce-9af3fbcb09cb","Type":"ContainerStarted","Data":"491f9cff98f35836abf1a909c920606b1b29ee0e6f0675edf9e7b2ed2e176bf2"} Nov 22 04:25:04 crc kubenswrapper[4699]: E1122 04:25:04.919404 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" Nov 22 04:25:04 crc kubenswrapper[4699]: I1122 04:25:04.930757 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=15.495565812 podStartE2EDuration="1m1.930734407s" 
podCreationTimestamp="2025-11-22 04:24:03 +0000 UTC" firstStartedPulling="2025-11-22 04:24:06.093792582 +0000 UTC m=+997.436413769" lastFinishedPulling="2025-11-22 04:24:52.528961177 +0000 UTC m=+1043.871582364" observedRunningTime="2025-11-22 04:25:04.930144173 +0000 UTC m=+1056.272765410" watchObservedRunningTime="2025-11-22 04:25:04.930734407 +0000 UTC m=+1056.273355604" Nov 22 04:25:05 crc kubenswrapper[4699]: I1122 04:25:05.275329 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 04:25:05 crc kubenswrapper[4699]: I1122 04:25:05.275397 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 04:25:05 crc kubenswrapper[4699]: I1122 04:25:05.977591 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s7mlz" podStartSLOduration=44.577164423 podStartE2EDuration="53.977545215s" podCreationTimestamp="2025-11-22 04:24:12 +0000 UTC" firstStartedPulling="2025-11-22 04:24:53.204071294 +0000 UTC m=+1044.546692481" lastFinishedPulling="2025-11-22 04:25:02.604452086 +0000 UTC m=+1053.947073273" observedRunningTime="2025-11-22 04:25:05.968303962 +0000 UTC m=+1057.310925159" watchObservedRunningTime="2025-11-22 04:25:05.977545215 +0000 UTC m=+1057.320166422" Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.253599 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s7mlz" Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.725634 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.725684 4699 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.725719 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.726330 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:25:08 crc kubenswrapper[4699]: I1122 04:25:08.726386 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c" gracePeriod=600 Nov 22 04:25:12 crc kubenswrapper[4699]: I1122 04:25:12.017631 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c" exitCode=0 Nov 22 04:25:12 crc kubenswrapper[4699]: I1122 04:25:12.017737 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c"} Nov 22 04:25:12 crc kubenswrapper[4699]: I1122 04:25:12.017979 4699 scope.go:117] "RemoveContainer" 
containerID="199b50f4c2609410414bdb3fb89b173b5d648f7f42f86d10fd711b75ac95c283" Nov 22 04:25:14 crc kubenswrapper[4699]: I1122 04:25:14.034095 4699 generic.go:334] "Generic (PLEG): container finished" podID="55053527-f2d2-4e44-8a9c-153b74ef3605" containerID="6641942a5ee354c45c6b3eee5f25d4ede5af59e03c744cab527856713544821b" exitCode=0 Nov 22 04:25:14 crc kubenswrapper[4699]: I1122 04:25:14.034161 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j7b96" event={"ID":"55053527-f2d2-4e44-8a9c-153b74ef3605","Type":"ContainerDied","Data":"6641942a5ee354c45c6b3eee5f25d4ede5af59e03c744cab527856713544821b"} Nov 22 04:25:19 crc kubenswrapper[4699]: E1122 04:25:19.762007 4699 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.136:37088->38.102.83.136:39863: read tcp 38.102.83.136:37088->38.102.83.136:39863: read: connection reset by peer Nov 22 04:25:21 crc kubenswrapper[4699]: I1122 04:25:21.098239 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b"} Nov 22 04:25:21 crc kubenswrapper[4699]: I1122 04:25:21.461202 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 04:25:21 crc kubenswrapper[4699]: I1122 04:25:21.595227 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.110415 4699 generic.go:334] "Generic (PLEG): container finished" podID="0afac114-a756-478a-a7d0-ec9952944484" containerID="a857da9d5b8dd7f78e00333cd79b2d92e649e91a901cbc758ea4a4505b2f846a" exitCode=0 Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.110581 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" event={"ID":"0afac114-a756-478a-a7d0-ec9952944484","Type":"ContainerDied","Data":"a857da9d5b8dd7f78e00333cd79b2d92e649e91a901cbc758ea4a4505b2f846a"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.115012 4699 generic.go:334] "Generic (PLEG): container finished" podID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerID="5e65f71829f916ee81698a9e6f1b554bc1301d8c00dfbac757e93e6ec7f6943e" exitCode=0 Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.115131 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" event={"ID":"c1137043-00d5-43bd-a4e1-6cdc2c17fb88","Type":"ContainerDied","Data":"5e65f71829f916ee81698a9e6f1b554bc1301d8c00dfbac757e93e6ec7f6943e"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.128411 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa","Type":"ContainerStarted","Data":"b3c72d80dc01962d9f00b67ce57d97237d8dcd51572b623beaad14f89d9acba0"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.130802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerStarted","Data":"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.173019 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j7b96" event={"ID":"55053527-f2d2-4e44-8a9c-153b74ef3605","Type":"ContainerStarted","Data":"010caf5fc1b0781cff68ab3ec1fcd472acfca29b6e8474ea6a66c807f0fd8215"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.196230 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-x9j67" 
event={"ID":"14d98ff4-07de-4764-a1a6-238316e83ee3","Type":"ContainerStarted","Data":"b712500833cb8fc9062802bf7089ccdba93a4fdd8a1b583fbf2e4e4520370bf2"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.197986 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c970cf2e-16a0-42fe-ba32-ee217bd82db8","Type":"ContainerStarted","Data":"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.198665 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.200424 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3e31d684-0292-4e13-8bce-9af3fbcb09cb","Type":"ContainerStarted","Data":"c97426b8fc3bd9689f2ffea499fffede8b0d3a04b1f2a43b71e6747bec15376a"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.218594 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=42.727614391 podStartE2EDuration="1m10.218573886s" podCreationTimestamp="2025-11-22 04:24:12 +0000 UTC" firstStartedPulling="2025-11-22 04:24:53.309422802 +0000 UTC m=+1044.652043989" lastFinishedPulling="2025-11-22 04:25:20.800382287 +0000 UTC m=+1072.143003484" observedRunningTime="2025-11-22 04:25:22.186658834 +0000 UTC m=+1073.529280041" watchObservedRunningTime="2025-11-22 04:25:22.218573886 +0000 UTC m=+1073.561195073" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.229834 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"02e377d7-9e5a-45ec-9460-16af64ce3db5","Type":"ContainerStarted","Data":"4e9f1cba8d3bb0d12e3cd0306becf427f53d2796b8cf36d4d7f378df796a0608"} Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.249990 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/kube-state-metrics-0" podStartSLOduration=46.493261572 podStartE2EDuration="1m14.249970765s" podCreationTimestamp="2025-11-22 04:24:08 +0000 UTC" firstStartedPulling="2025-11-22 04:24:53.192865043 +0000 UTC m=+1044.535486230" lastFinishedPulling="2025-11-22 04:25:20.949574236 +0000 UTC m=+1072.292195423" observedRunningTime="2025-11-22 04:25:22.244723919 +0000 UTC m=+1073.587345106" watchObservedRunningTime="2025-11-22 04:25:22.249970765 +0000 UTC m=+1073.592591952" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.271847 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-x9j67" podStartSLOduration=7.826944992 podStartE2EDuration="26.271827954s" podCreationTimestamp="2025-11-22 04:24:56 +0000 UTC" firstStartedPulling="2025-11-22 04:25:02.355497855 +0000 UTC m=+1053.698119082" lastFinishedPulling="2025-11-22 04:25:20.800380857 +0000 UTC m=+1072.143002044" observedRunningTime="2025-11-22 04:25:22.26503943 +0000 UTC m=+1073.607660637" watchObservedRunningTime="2025-11-22 04:25:22.271827954 +0000 UTC m=+1073.614449141" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.316636 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=42.673337525 podStartE2EDuration="1m7.316612817s" podCreationTimestamp="2025-11-22 04:24:15 +0000 UTC" firstStartedPulling="2025-11-22 04:24:56.114046164 +0000 UTC m=+1047.456667351" lastFinishedPulling="2025-11-22 04:25:20.757321456 +0000 UTC m=+1072.099942643" observedRunningTime="2025-11-22 04:25:22.30307047 +0000 UTC m=+1073.645691647" watchObservedRunningTime="2025-11-22 04:25:22.316612817 +0000 UTC m=+1073.659234004" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.338496 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.158899492 podStartE2EDuration="1m16.338330942s" podCreationTimestamp="2025-11-22 
04:24:06 +0000 UTC" firstStartedPulling="2025-11-22 04:24:07.430962182 +0000 UTC m=+998.773583369" lastFinishedPulling="2025-11-22 04:25:20.610393632 +0000 UTC m=+1071.953014819" observedRunningTime="2025-11-22 04:25:22.328235898 +0000 UTC m=+1073.670857105" watchObservedRunningTime="2025-11-22 04:25:22.338330942 +0000 UTC m=+1073.680952129" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.416726 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 04:25:22 crc kubenswrapper[4699]: I1122 04:25:22.465422 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.135766 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.173315 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.239087 4699 generic.go:334] "Generic (PLEG): container finished" podID="e74585bc-d1cf-473d-95ca-12c816ff0020" containerID="b035d516fae4eb150a245b83df03ec1139eaa35505122491d4168a0b7204efb8" exitCode=0 Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.239172 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e74585bc-d1cf-473d-95ca-12c816ff0020","Type":"ContainerDied","Data":"b035d516fae4eb150a245b83df03ec1139eaa35505122491d4168a0b7204efb8"} Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.240746 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerStarted","Data":"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4"} Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.244622 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-ovs-j7b96" event={"ID":"55053527-f2d2-4e44-8a9c-153b74ef3605","Type":"ContainerStarted","Data":"fa6c9d0f725148ce13ffe0e583dbe70fb6e501cc795c2e55b2371c347bc62d33"} Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.244906 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.244934 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.247723 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" event={"ID":"0afac114-a756-478a-a7d0-ec9952944484","Type":"ContainerStarted","Data":"8021c686740304dde58b0fb0395d3e4f3626980a0d98c6f32214792c465711c6"} Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.247836 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.250753 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" event={"ID":"c1137043-00d5-43bd-a4e1-6cdc2c17fb88","Type":"ContainerStarted","Data":"36b603b3480a668c4237d82d61ed45657b30f48df00cdbf4ba61b5431828ae34"} Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.250785 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.252025 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.252048 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.290756 4699 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" podStartSLOduration=8.917508058 podStartE2EDuration="27.290732967s" podCreationTimestamp="2025-11-22 04:24:56 +0000 UTC" firstStartedPulling="2025-11-22 04:25:02.343838353 +0000 UTC m=+1053.686459540" lastFinishedPulling="2025-11-22 04:25:20.717063262 +0000 UTC m=+1072.059684449" observedRunningTime="2025-11-22 04:25:23.287053598 +0000 UTC m=+1074.629674805" watchObservedRunningTime="2025-11-22 04:25:23.290732967 +0000 UTC m=+1074.633354174" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.312806 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.314808 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" podStartSLOduration=7.237989043 podStartE2EDuration="26.314788579s" podCreationTimestamp="2025-11-22 04:24:57 +0000 UTC" firstStartedPulling="2025-11-22 04:25:01.593894205 +0000 UTC m=+1052.936515392" lastFinishedPulling="2025-11-22 04:25:20.670693731 +0000 UTC m=+1072.013314928" observedRunningTime="2025-11-22 04:25:23.309792798 +0000 UTC m=+1074.652414005" watchObservedRunningTime="2025-11-22 04:25:23.314788579 +0000 UTC m=+1074.657409766" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.323531 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.367386 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-j7b96" podStartSLOduration=64.524013602 podStartE2EDuration="1m11.36736865s" podCreationTimestamp="2025-11-22 04:24:12 +0000 UTC" firstStartedPulling="2025-11-22 04:24:54.078412251 +0000 UTC m=+1045.421033438" lastFinishedPulling="2025-11-22 04:25:00.921767299 +0000 UTC m=+1052.264388486" observedRunningTime="2025-11-22 04:25:23.344316663 +0000 UTC 
m=+1074.686937870" watchObservedRunningTime="2025-11-22 04:25:23.36736865 +0000 UTC m=+1074.709989837" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.624855 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.629206 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.630907 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hjplm" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.631131 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.631270 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.631582 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.641060 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.661801 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cgqm\" (UniqueName: \"kubernetes.io/projected/6022714c-eabe-49a9-b794-0b7a0097b816-kube-api-access-8cgqm\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.661911 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " 
pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.661966 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.662087 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-scripts\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.662144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.662186 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.662214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-config\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763239 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8cgqm\" (UniqueName: \"kubernetes.io/projected/6022714c-eabe-49a9-b794-0b7a0097b816-kube-api-access-8cgqm\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763310 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763341 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-scripts\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763412 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763425 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.763463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-config\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.764402 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.764601 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-config\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.765022 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6022714c-eabe-49a9-b794-0b7a0097b816-scripts\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.768347 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.768353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.768520 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6022714c-eabe-49a9-b794-0b7a0097b816-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.784399 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cgqm\" (UniqueName: \"kubernetes.io/projected/6022714c-eabe-49a9-b794-0b7a0097b816-kube-api-access-8cgqm\") pod \"ovn-northd-0\" (UID: \"6022714c-eabe-49a9-b794-0b7a0097b816\") " pod="openstack/ovn-northd-0" Nov 22 04:25:23 crc kubenswrapper[4699]: I1122 04:25:23.959477 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 04:25:24 crc kubenswrapper[4699]: I1122 04:25:24.260243 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e74585bc-d1cf-473d-95ca-12c816ff0020","Type":"ContainerStarted","Data":"d0b9b11e86032bdf78a9f5421742a0e57141de6109d2f50760f7e2d49838af3b"} Nov 22 04:25:24 crc kubenswrapper[4699]: I1122 04:25:24.282884 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371957.57191 podStartE2EDuration="1m19.282865632s" podCreationTimestamp="2025-11-22 04:24:05 +0000 UTC" firstStartedPulling="2025-11-22 04:24:07.566637873 +0000 UTC m=+998.909259060" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:24.278503727 +0000 UTC m=+1075.621124944" watchObservedRunningTime="2025-11-22 04:25:24.282865632 +0000 UTC m=+1075.625486809" Nov 22 04:25:24 crc kubenswrapper[4699]: I1122 04:25:24.396026 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 04:25:25 crc kubenswrapper[4699]: I1122 04:25:25.268802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6022714c-eabe-49a9-b794-0b7a0097b816","Type":"ContainerStarted","Data":"be4947699f06e8ac907cb2221c6fde87d1abc3ed1eb9e777ae919ed03771e5e3"} Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.427628 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-81ad-account-create-jflt8"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.429351 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.431407 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.436970 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-81ad-account-create-jflt8"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.476057 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-p6fhv"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.477998 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.486368 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-p6fhv"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.528931 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ppt\" (UniqueName: \"kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.529111 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.529146 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.529281 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpqpx\" (UniqueName: \"kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.631547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpqpx\" (UniqueName: \"kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.631685 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ppt\" (UniqueName: \"kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.631736 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.631762 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.632588 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.632591 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.651255 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ppt\" (UniqueName: \"kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt\") pod \"keystone-db-create-p6fhv\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.652346 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ck2mh"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.653465 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.656935 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpqpx\" (UniqueName: \"kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx\") pod \"keystone-81ad-account-create-jflt8\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.665024 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ck2mh"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.733103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb649\" (UniqueName: \"kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.733255 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.752328 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.763863 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f2d9-account-create-9fnts"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.765160 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.781528 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f2d9-account-create-9fnts"] Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.783723 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.804644 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.807913 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.809140 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.819747 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.819801 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.845894 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.849604 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " 
pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.852229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb649\" (UniqueName: \"kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.852329 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqk42\" (UniqueName: \"kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42\") pod \"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.854517 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts\") pod \"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.875755 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb649\" (UniqueName: \"kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649\") pod \"placement-db-create-ck2mh\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.957784 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqk42\" (UniqueName: \"kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42\") pod 
\"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.957880 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts\") pod \"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.959416 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts\") pod \"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:26 crc kubenswrapper[4699]: I1122 04:25:26.993207 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqk42\" (UniqueName: \"kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42\") pod \"placement-f2d9-account-create-9fnts\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.040796 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.065926 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mnjhh"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.067853 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.081734 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mnjhh"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.161361 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.161494 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87k6h\" (UniqueName: \"kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.179735 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1642-account-create-ht5w7"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.181152 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.191422 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.199345 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.223418 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1642-account-create-ht5w7"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.262754 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87k6h\" (UniqueName: \"kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.262800 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbg4h\" (UniqueName: \"kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.262860 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.263023 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.264449 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.285559 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87k6h\" (UniqueName: \"kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h\") pod \"glance-db-create-mnjhh\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.308705 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.365298 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbg4h\" (UniqueName: \"kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.365455 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.367214 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " 
pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.391679 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbg4h\" (UniqueName: \"kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h\") pod \"glance-1642-account-create-ht5w7\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.394256 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.397620 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-81ad-account-create-jflt8"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.503830 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:27 crc kubenswrapper[4699]: W1122 04:25:27.506973 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8afe223c_55c0_40b3_aa14_ea52cad6bccc.slice/crio-3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43 WatchSource:0}: Error finding container 3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43: Status 404 returned error can't find the container with id 3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43 Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.535876 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.549883 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-p6fhv"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.670279 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.722743 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ck2mh"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.757558 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f2d9-account-create-9fnts"] Nov 22 04:25:27 crc kubenswrapper[4699]: I1122 04:25:27.946330 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1642-account-create-ht5w7"] Nov 22 04:25:27 crc kubenswrapper[4699]: W1122 04:25:27.959751 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc99675e1_93f6_4b73_b4cb_e8f096c3c16e.slice/crio-b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5 WatchSource:0}: Error finding container b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5: Status 404 returned error can't find the container with id b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5 Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.079578 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mnjhh"] Nov 22 04:25:28 crc kubenswrapper[4699]: W1122 04:25:28.147507 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe7d25aa_4c77_48b6_88fe_11339dbca63a.slice/crio-06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d WatchSource:0}: Error finding container 06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d: Status 404 returned error can't find the container with id 06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.293110 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1642-account-create-ht5w7" 
event={"ID":"c99675e1-93f6-4b73-b4cb-e8f096c3c16e","Type":"ContainerStarted","Data":"b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.294845 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mnjhh" event={"ID":"fe7d25aa-4c77-48b6-88fe-11339dbca63a","Type":"ContainerStarted","Data":"06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.296055 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d9-account-create-9fnts" event={"ID":"8070964f-baae-4437-b0b7-2ff91608f0d7","Type":"ContainerStarted","Data":"8839bbea3f91000a46dde2a854ff60eb9fa34ff7b770bfb0620bf6435edeaae8"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.297139 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p6fhv" event={"ID":"9e3fa899-c823-4cab-8224-1ca3130f515a","Type":"ContainerStarted","Data":"3d4ce34c365c691a487edf56de2f6aed819c6a35ed2ccf02231f29422a2b37ce"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.298088 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-81ad-account-create-jflt8" event={"ID":"8afe223c-55c0-40b3-aa14-ea52cad6bccc","Type":"ContainerStarted","Data":"3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.299187 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ck2mh" event={"ID":"73de98eb-db4a-47f1-b23a-aa38b2db9078","Type":"ContainerStarted","Data":"ea5c33e7a0e256de3c2af070fca67b02887e23e893da1e4b339b2ca2d5338684"} Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.299402 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="dnsmasq-dns" 
containerID="cri-o://36b603b3480a668c4237d82d61ed45657b30f48df00cdbf4ba61b5431828ae34" gracePeriod=10 Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.729426 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.730707 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.759555 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.794077 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.794163 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.794210 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.794256 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgmz8\" 
(UniqueName: \"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.794289 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.847198 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.895726 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.895851 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.895924 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 
04:25:28.895969 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.896036 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgmz8\" (UniqueName: \"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.897299 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.897689 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.901026 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.904234 4699 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:28 crc kubenswrapper[4699]: I1122 04:25:28.945534 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgmz8\" (UniqueName: \"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8\") pod \"dnsmasq-dns-698758b865-bdb7g\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.053038 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.312386 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-81ad-account-create-jflt8" event={"ID":"8afe223c-55c0-40b3-aa14-ea52cad6bccc","Type":"ContainerStarted","Data":"c34babc09fa972896b17116ea685f90d6a7b38dce92f5a6f8a3fd517e7d2fa98"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.316404 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ck2mh" event={"ID":"73de98eb-db4a-47f1-b23a-aa38b2db9078","Type":"ContainerStarted","Data":"03cac1d004eeb36b5f78d4c545557cca1039054d9489fe9752da173743c5aaf6"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.322426 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1642-account-create-ht5w7" event={"ID":"c99675e1-93f6-4b73-b4cb-e8f096c3c16e","Type":"ContainerStarted","Data":"8b6ba740e45459f2bdbb9721045729d9604eddcf85137656d66d2083a581dbfb"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.325165 4699 generic.go:334] "Generic (PLEG): container finished" podID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" 
containerID="36b603b3480a668c4237d82d61ed45657b30f48df00cdbf4ba61b5431828ae34" exitCode=0 Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.325230 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" event={"ID":"c1137043-00d5-43bd-a4e1-6cdc2c17fb88","Type":"ContainerDied","Data":"36b603b3480a668c4237d82d61ed45657b30f48df00cdbf4ba61b5431828ae34"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.327214 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mnjhh" event={"ID":"fe7d25aa-4c77-48b6-88fe-11339dbca63a","Type":"ContainerStarted","Data":"50684eda9651e67a2aa904aa2ecb7f8fd420565e07461db74e9efca2b0530472"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.329529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d9-account-create-9fnts" event={"ID":"8070964f-baae-4437-b0b7-2ff91608f0d7","Type":"ContainerStarted","Data":"4c2ddb86338033d36a854e6a597e4f32f3f958d00f97a4e665fe3ed6fb5b5942"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.330684 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p6fhv" event={"ID":"9e3fa899-c823-4cab-8224-1ca3130f515a","Type":"ContainerStarted","Data":"315df20183e7724f4f44f825f46816d1c0546d85f93272ae82319dcc710d1f28"} Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.664993 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.863994 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.872943 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.875471 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dpqhp" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.876016 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.876282 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.877674 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 22 04:25:29 crc kubenswrapper[4699]: I1122 04:25:29.883638 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.039447 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-cache\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.039511 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.039533 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc 
kubenswrapper[4699]: I1122 04:25:30.039566 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncb2j\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-kube-api-access-ncb2j\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.039617 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-lock\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141364 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-cache\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141425 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141503 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141537 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncb2j\" (UniqueName: 
\"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-kube-api-access-ncb2j\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-lock\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.141700 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.141732 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.141787 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:25:30.641765611 +0000 UTC m=+1081.984386798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.141817 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.142057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-cache\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.142136 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ff24634a-9171-4a5c-b045-4c653e032c18-lock\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.158057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncb2j\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-kube-api-access-ncb2j\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.173300 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " 
pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.339796 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerStarted","Data":"cc40699197245008474c492eab670ed23d89b058441b1123e7f86b93c94138d6"} Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.357673 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f2d9-account-create-9fnts" podStartSLOduration=4.357654243 podStartE2EDuration="4.357654243s" podCreationTimestamp="2025-11-22 04:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.354625599 +0000 UTC m=+1081.697246796" watchObservedRunningTime="2025-11-22 04:25:30.357654243 +0000 UTC m=+1081.700275430" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.367974 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-81ad-account-create-jflt8" podStartSLOduration=4.367951582 podStartE2EDuration="4.367951582s" podCreationTimestamp="2025-11-22 04:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.366582349 +0000 UTC m=+1081.709203556" watchObservedRunningTime="2025-11-22 04:25:30.367951582 +0000 UTC m=+1081.710572769" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.393026 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-p6fhv" podStartSLOduration=4.393009168 podStartE2EDuration="4.393009168s" podCreationTimestamp="2025-11-22 04:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.387397792 +0000 UTC m=+1081.730018979" 
watchObservedRunningTime="2025-11-22 04:25:30.393009168 +0000 UTC m=+1081.735630355" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.406044 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1642-account-create-ht5w7" podStartSLOduration=3.406025842 podStartE2EDuration="3.406025842s" podCreationTimestamp="2025-11-22 04:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.402784544 +0000 UTC m=+1081.745405751" watchObservedRunningTime="2025-11-22 04:25:30.406025842 +0000 UTC m=+1081.748647029" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.418668 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-mnjhh" podStartSLOduration=3.418650588 podStartE2EDuration="3.418650588s" podCreationTimestamp="2025-11-22 04:25:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.4166916 +0000 UTC m=+1081.759312797" watchObservedRunningTime="2025-11-22 04:25:30.418650588 +0000 UTC m=+1081.761271775" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.431737 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-ck2mh" podStartSLOduration=4.431717804 podStartE2EDuration="4.431717804s" podCreationTimestamp="2025-11-22 04:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:30.429638384 +0000 UTC m=+1081.772259591" watchObservedRunningTime="2025-11-22 04:25:30.431717804 +0000 UTC m=+1081.774338991" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.650698 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.650937 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.651130 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: E1122 04:25:30.651219 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:25:31.651193722 +0000 UTC m=+1082.993814909 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.662616 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.752542 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb\") pod \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.752624 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config\") pod \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.752709 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc\") pod \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.752759 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s5jb\" (UniqueName: \"kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb\") pod \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\" (UID: \"c1137043-00d5-43bd-a4e1-6cdc2c17fb88\") " Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.761234 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb" (OuterVolumeSpecName: "kube-api-access-8s5jb") pod "c1137043-00d5-43bd-a4e1-6cdc2c17fb88" (UID: "c1137043-00d5-43bd-a4e1-6cdc2c17fb88"). InnerVolumeSpecName "kube-api-access-8s5jb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.809838 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1137043-00d5-43bd-a4e1-6cdc2c17fb88" (UID: "c1137043-00d5-43bd-a4e1-6cdc2c17fb88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.825525 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1137043-00d5-43bd-a4e1-6cdc2c17fb88" (UID: "c1137043-00d5-43bd-a4e1-6cdc2c17fb88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.825540 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config" (OuterVolumeSpecName: "config") pod "c1137043-00d5-43bd-a4e1-6cdc2c17fb88" (UID: "c1137043-00d5-43bd-a4e1-6cdc2c17fb88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.855326 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s5jb\" (UniqueName: \"kubernetes.io/projected/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-kube-api-access-8s5jb\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.855366 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.855378 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:30 crc kubenswrapper[4699]: I1122 04:25:30.855388 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1137043-00d5-43bd-a4e1-6cdc2c17fb88-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.348373 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" event={"ID":"c1137043-00d5-43bd-a4e1-6cdc2c17fb88","Type":"ContainerDied","Data":"fe5795743758b2b5bf651eef0434d985a51b20ba6063c88d90bdfd53bc00c58e"} Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.348402 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-ztht2" Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.348423 4699 scope.go:117] "RemoveContainer" containerID="36b603b3480a668c4237d82d61ed45657b30f48df00cdbf4ba61b5431828ae34" Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.352953 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerStarted","Data":"522cfd2e77f901493bb0502a1be8ae64f66af845bbc9f8a50c9da6781070b0c1"} Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.380081 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.385340 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-ztht2"] Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.458749 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" path="/var/lib/kubelet/pods/c1137043-00d5-43bd-a4e1-6cdc2c17fb88/volumes" Nov 22 04:25:31 crc kubenswrapper[4699]: I1122 04:25:31.669962 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:31 crc kubenswrapper[4699]: E1122 04:25:31.670106 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:31 crc kubenswrapper[4699]: E1122 04:25:31.670327 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:31 crc kubenswrapper[4699]: E1122 04:25:31.670402 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:25:33.670378172 +0000 UTC m=+1085.012999369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:32 crc kubenswrapper[4699]: I1122 04:25:32.361568 4699 generic.go:334] "Generic (PLEG): container finished" podID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerID="522cfd2e77f901493bb0502a1be8ae64f66af845bbc9f8a50c9da6781070b0c1" exitCode=0 Nov 22 04:25:32 crc kubenswrapper[4699]: I1122 04:25:32.361794 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerDied","Data":"522cfd2e77f901493bb0502a1be8ae64f66af845bbc9f8a50c9da6781070b0c1"} Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.724505 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.724756 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.724789 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.724872 4699 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:25:37.724851739 +0000 UTC m=+1089.067472936 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.862503 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-w4rqf"] Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.862845 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="dnsmasq-dns" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.862862 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="dnsmasq-dns" Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.862872 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="init" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.862878 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="init" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.863025 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1137043-00d5-43bd-a4e1-6cdc2c17fb88" containerName="dnsmasq-dns" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.863547 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.870017 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.870065 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.871393 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.908827 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-t9rdp"] Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.910052 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.917499 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-w4rqf"] Nov 22 04:25:33 crc kubenswrapper[4699]: E1122 04:25:33.918113 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fn9h7 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fn9h7 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-w4rqf" podUID="5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.926059 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t9rdp"] Nov 22 04:25:33 crc kubenswrapper[4699]: I1122 04:25:33.931773 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-w4rqf"] Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028418 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028512 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn9h7\" (UniqueName: \"kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028537 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028562 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 
04:25:34.028583 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028610 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028626 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028657 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028681 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: 
I1122 04:25:34.028697 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028717 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028733 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7zb\" (UniqueName: \"kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.028752 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130641 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn9h7\" (UniqueName: \"kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc 
kubenswrapper[4699]: I1122 04:25:34.130733 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130777 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130811 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130862 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130891 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130944 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.130978 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131005 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131038 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131069 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7zb\" (UniqueName: \"kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131095 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131215 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131241 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131421 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131822 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.131970 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices\") pod \"swift-ring-rebalance-t9rdp\" (UID: 
\"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.133181 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.133328 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.133704 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.137951 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.138051 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 
04:25:34.139935 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.140949 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.145882 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.150640 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.153110 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn9h7\" (UniqueName: \"kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7\") pod \"swift-ring-rebalance-w4rqf\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.153627 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7zb\" 
(UniqueName: \"kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb\") pod \"swift-ring-rebalance-t9rdp\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.225499 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.374690 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.388467 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536206 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn9h7\" (UniqueName: \"kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536628 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536687 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536737 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536792 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536829 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.536914 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices\") pod \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\" (UID: \"5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50\") " Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.537313 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts" (OuterVolumeSpecName: "scripts") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.537785 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.537953 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.538014 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.542184 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.542703 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.542783 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7" (OuterVolumeSpecName: "kube-api-access-fn9h7") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "kube-api-access-fn9h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.544048 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" (UID: "5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.639635 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn9h7\" (UniqueName: \"kubernetes.io/projected/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-kube-api-access-fn9h7\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.639952 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.640022 4699 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.640106 4699 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.640195 4699 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:34 crc kubenswrapper[4699]: I1122 04:25:34.640301 4699 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.298982 4699 scope.go:117] "RemoveContainer" containerID="5e65f71829f916ee81698a9e6f1b554bc1301d8c00dfbac757e93e6ec7f6943e" Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.405200 4699 generic.go:334] "Generic (PLEG): container finished" podID="9e3fa899-c823-4cab-8224-1ca3130f515a" containerID="315df20183e7724f4f44f825f46816d1c0546d85f93272ae82319dcc710d1f28" exitCode=0 Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.405344 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p6fhv" event={"ID":"9e3fa899-c823-4cab-8224-1ca3130f515a","Type":"ContainerDied","Data":"315df20183e7724f4f44f825f46816d1c0546d85f93272ae82319dcc710d1f28"} Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.413935 4699 generic.go:334] "Generic (PLEG): container finished" podID="c99675e1-93f6-4b73-b4cb-e8f096c3c16e" containerID="8b6ba740e45459f2bdbb9721045729d9604eddcf85137656d66d2083a581dbfb" exitCode=0 Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.414015 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1642-account-create-ht5w7" event={"ID":"c99675e1-93f6-4b73-b4cb-e8f096c3c16e","Type":"ContainerDied","Data":"8b6ba740e45459f2bdbb9721045729d9604eddcf85137656d66d2083a581dbfb"} Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 
04:25:35.416749 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w4rqf" Nov 22 04:25:35 crc kubenswrapper[4699]: I1122 04:25:35.656389 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-t9rdp"] Nov 22 04:25:35 crc kubenswrapper[4699]: W1122 04:25:35.704788 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded96c0b0_7b76_4f03_b352_461405bbfb23.slice/crio-25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e WatchSource:0}: Error finding container 25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e: Status 404 returned error can't find the container with id 25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.428305 4699 generic.go:334] "Generic (PLEG): container finished" podID="73de98eb-db4a-47f1-b23a-aa38b2db9078" containerID="03cac1d004eeb36b5f78d4c545557cca1039054d9489fe9752da173743c5aaf6" exitCode=0 Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.428407 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ck2mh" event={"ID":"73de98eb-db4a-47f1-b23a-aa38b2db9078","Type":"ContainerDied","Data":"03cac1d004eeb36b5f78d4c545557cca1039054d9489fe9752da173743c5aaf6"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.435925 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6022714c-eabe-49a9-b794-0b7a0097b816","Type":"ContainerStarted","Data":"c86c04755a51c930f754ddaa6a2476535c9a394d7ba3dc39c3bfd1559dc0ffa2"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.436034 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"6022714c-eabe-49a9-b794-0b7a0097b816","Type":"ContainerStarted","Data":"97bb916fa4e28d88dc0e83d8c28646ddc815c96c2bfd046cf8751049766a8de7"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.437560 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.441643 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t9rdp" event={"ID":"ed96c0b0-7b76-4f03-b352-461405bbfb23","Type":"ContainerStarted","Data":"25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.444231 4699 generic.go:334] "Generic (PLEG): container finished" podID="fe7d25aa-4c77-48b6-88fe-11339dbca63a" containerID="50684eda9651e67a2aa904aa2ecb7f8fd420565e07461db74e9efca2b0530472" exitCode=0 Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.444246 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mnjhh" event={"ID":"fe7d25aa-4c77-48b6-88fe-11339dbca63a","Type":"ContainerDied","Data":"50684eda9651e67a2aa904aa2ecb7f8fd420565e07461db74e9efca2b0530472"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.445821 4699 generic.go:334] "Generic (PLEG): container finished" podID="8070964f-baae-4437-b0b7-2ff91608f0d7" containerID="4c2ddb86338033d36a854e6a597e4f32f3f958d00f97a4e665fe3ed6fb5b5942" exitCode=0 Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.445891 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d9-account-create-9fnts" event={"ID":"8070964f-baae-4437-b0b7-2ff91608f0d7","Type":"ContainerDied","Data":"4c2ddb86338033d36a854e6a597e4f32f3f958d00f97a4e665fe3ed6fb5b5942"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.448845 4699 generic.go:334] "Generic (PLEG): container finished" podID="8afe223c-55c0-40b3-aa14-ea52cad6bccc" 
containerID="c34babc09fa972896b17116ea685f90d6a7b38dce92f5a6f8a3fd517e7d2fa98" exitCode=0 Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.448890 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-81ad-account-create-jflt8" event={"ID":"8afe223c-55c0-40b3-aa14-ea52cad6bccc","Type":"ContainerDied","Data":"c34babc09fa972896b17116ea685f90d6a7b38dce92f5a6f8a3fd517e7d2fa98"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.457756 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerStarted","Data":"22cbc53f0e4539ab600ee81cc3fc691c24f4f65d650e4b1a822c0d12ea4098a1"} Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.457942 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.509193 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.547797205 podStartE2EDuration="13.509175909s" podCreationTimestamp="2025-11-22 04:25:23 +0000 UTC" firstStartedPulling="2025-11-22 04:25:24.402634039 +0000 UTC m=+1075.745255226" lastFinishedPulling="2025-11-22 04:25:35.364012743 +0000 UTC m=+1086.706633930" observedRunningTime="2025-11-22 04:25:36.492854954 +0000 UTC m=+1087.835476141" watchObservedRunningTime="2025-11-22 04:25:36.509175909 +0000 UTC m=+1087.851797096" Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.821498 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.848002 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-bdb7g" podStartSLOduration=8.847985263 podStartE2EDuration="8.847985263s" podCreationTimestamp="2025-11-22 04:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:25:36.564822265 +0000 UTC m=+1087.907443462" watchObservedRunningTime="2025-11-22 04:25:36.847985263 +0000 UTC m=+1088.190606440" Nov 22 04:25:36 crc kubenswrapper[4699]: I1122 04:25:36.936080 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.007530 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts\") pod \"9e3fa899-c823-4cab-8224-1ca3130f515a\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.007613 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ppt\" (UniqueName: \"kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt\") pod \"9e3fa899-c823-4cab-8224-1ca3130f515a\" (UID: \"9e3fa899-c823-4cab-8224-1ca3130f515a\") " Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.008226 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e3fa899-c823-4cab-8224-1ca3130f515a" (UID: "9e3fa899-c823-4cab-8224-1ca3130f515a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.013270 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt" (OuterVolumeSpecName: "kube-api-access-p6ppt") pod "9e3fa899-c823-4cab-8224-1ca3130f515a" (UID: "9e3fa899-c823-4cab-8224-1ca3130f515a"). InnerVolumeSpecName "kube-api-access-p6ppt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.109198 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts\") pod \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.109301 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbg4h\" (UniqueName: \"kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h\") pod \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\" (UID: \"c99675e1-93f6-4b73-b4cb-e8f096c3c16e\") " Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.109867 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ppt\" (UniqueName: \"kubernetes.io/projected/9e3fa899-c823-4cab-8224-1ca3130f515a-kube-api-access-p6ppt\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.109895 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e3fa899-c823-4cab-8224-1ca3130f515a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.109925 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c99675e1-93f6-4b73-b4cb-e8f096c3c16e" (UID: "c99675e1-93f6-4b73-b4cb-e8f096c3c16e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.112573 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h" (OuterVolumeSpecName: "kube-api-access-lbg4h") pod "c99675e1-93f6-4b73-b4cb-e8f096c3c16e" (UID: "c99675e1-93f6-4b73-b4cb-e8f096c3c16e"). InnerVolumeSpecName "kube-api-access-lbg4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.212144 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.212208 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbg4h\" (UniqueName: \"kubernetes.io/projected/c99675e1-93f6-4b73-b4cb-e8f096c3c16e-kube-api-access-lbg4h\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.249878 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.328567 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.483326 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-p6fhv" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.484034 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-p6fhv" event={"ID":"9e3fa899-c823-4cab-8224-1ca3130f515a","Type":"ContainerDied","Data":"3d4ce34c365c691a487edf56de2f6aed819c6a35ed2ccf02231f29422a2b37ce"} Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.484060 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d4ce34c365c691a487edf56de2f6aed819c6a35ed2ccf02231f29422a2b37ce" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.486884 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1642-account-create-ht5w7" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.487313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1642-account-create-ht5w7" event={"ID":"c99675e1-93f6-4b73-b4cb-e8f096c3c16e","Type":"ContainerDied","Data":"b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5"} Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.487339 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b938e23aa7a2692ac4a03ab3e4869276dcde61d97b9cc6b36686f0f0e20ce3a5" Nov 22 04:25:37 crc kubenswrapper[4699]: I1122 04:25:37.827926 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:37 crc kubenswrapper[4699]: E1122 04:25:37.828300 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:37 crc kubenswrapper[4699]: E1122 04:25:37.828331 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:37 crc kubenswrapper[4699]: E1122 04:25:37.828413 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:25:45.828389075 +0000 UTC m=+1097.171010272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:38 crc kubenswrapper[4699]: I1122 04:25:38.289274 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7mlz" podUID="0311366c-c8c7-449c-b617-213a4d87de00" containerName="ovn-controller" probeResult="failure" output=< Nov 22 04:25:38 crc kubenswrapper[4699]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 04:25:38 crc kubenswrapper[4699]: > Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.058295 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.149391 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts\") pod \"73de98eb-db4a-47f1-b23a-aa38b2db9078\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.149732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb649\" (UniqueName: \"kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649\") pod \"73de98eb-db4a-47f1-b23a-aa38b2db9078\" (UID: \"73de98eb-db4a-47f1-b23a-aa38b2db9078\") " Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.150004 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73de98eb-db4a-47f1-b23a-aa38b2db9078" (UID: "73de98eb-db4a-47f1-b23a-aa38b2db9078"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.150215 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73de98eb-db4a-47f1-b23a-aa38b2db9078-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.156576 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649" (OuterVolumeSpecName: "kube-api-access-lb649") pod "73de98eb-db4a-47f1-b23a-aa38b2db9078" (UID: "73de98eb-db4a-47f1-b23a-aa38b2db9078"). InnerVolumeSpecName "kube-api-access-lb649". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.252195 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb649\" (UniqueName: \"kubernetes.io/projected/73de98eb-db4a-47f1-b23a-aa38b2db9078-kube-api-access-lb649\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.506146 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ck2mh" event={"ID":"73de98eb-db4a-47f1-b23a-aa38b2db9078","Type":"ContainerDied","Data":"ea5c33e7a0e256de3c2af070fca67b02887e23e893da1e4b339b2ca2d5338684"} Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.506188 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea5c33e7a0e256de3c2af070fca67b02887e23e893da1e4b339b2ca2d5338684" Nov 22 04:25:39 crc kubenswrapper[4699]: I1122 04:25:39.506227 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ck2mh" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.241738 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.249835 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.286891 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.376782 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts\") pod \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.376859 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts\") pod \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.376952 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87k6h\" (UniqueName: \"kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h\") pod \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\" (UID: \"fe7d25aa-4c77-48b6-88fe-11339dbca63a\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377008 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqk42\" (UniqueName: \"kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42\") pod \"8070964f-baae-4437-b0b7-2ff91608f0d7\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377057 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts\") pod \"8070964f-baae-4437-b0b7-2ff91608f0d7\" (UID: \"8070964f-baae-4437-b0b7-2ff91608f0d7\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377245 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zpqpx\" (UniqueName: \"kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx\") pod \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\" (UID: \"8afe223c-55c0-40b3-aa14-ea52cad6bccc\") " Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377384 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8afe223c-55c0-40b3-aa14-ea52cad6bccc" (UID: "8afe223c-55c0-40b3-aa14-ea52cad6bccc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377426 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe7d25aa-4c77-48b6-88fe-11339dbca63a" (UID: "fe7d25aa-4c77-48b6-88fe-11339dbca63a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.377850 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8070964f-baae-4437-b0b7-2ff91608f0d7" (UID: "8070964f-baae-4437-b0b7-2ff91608f0d7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.378047 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8afe223c-55c0-40b3-aa14-ea52cad6bccc-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.378081 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7d25aa-4c77-48b6-88fe-11339dbca63a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.378095 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8070964f-baae-4437-b0b7-2ff91608f0d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.380605 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h" (OuterVolumeSpecName: "kube-api-access-87k6h") pod "fe7d25aa-4c77-48b6-88fe-11339dbca63a" (UID: "fe7d25aa-4c77-48b6-88fe-11339dbca63a"). InnerVolumeSpecName "kube-api-access-87k6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.380637 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx" (OuterVolumeSpecName: "kube-api-access-zpqpx") pod "8afe223c-55c0-40b3-aa14-ea52cad6bccc" (UID: "8afe223c-55c0-40b3-aa14-ea52cad6bccc"). InnerVolumeSpecName "kube-api-access-zpqpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.382623 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42" (OuterVolumeSpecName: "kube-api-access-vqk42") pod "8070964f-baae-4437-b0b7-2ff91608f0d7" (UID: "8070964f-baae-4437-b0b7-2ff91608f0d7"). InnerVolumeSpecName "kube-api-access-vqk42". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.479989 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpqpx\" (UniqueName: \"kubernetes.io/projected/8afe223c-55c0-40b3-aa14-ea52cad6bccc-kube-api-access-zpqpx\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.480025 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87k6h\" (UniqueName: \"kubernetes.io/projected/fe7d25aa-4c77-48b6-88fe-11339dbca63a-kube-api-access-87k6h\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.480036 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqk42\" (UniqueName: \"kubernetes.io/projected/8070964f-baae-4437-b0b7-2ff91608f0d7-kube-api-access-vqk42\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.514462 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-81ad-account-create-jflt8" event={"ID":"8afe223c-55c0-40b3-aa14-ea52cad6bccc","Type":"ContainerDied","Data":"3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43"} Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.514519 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3463cc88d7a91338f6ab0e6ad3d1845fa66b535f58ee1396eda989333499fc43" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.514488 4699 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-81ad-account-create-jflt8" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.515680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t9rdp" event={"ID":"ed96c0b0-7b76-4f03-b352-461405bbfb23","Type":"ContainerStarted","Data":"cafd0a208383aab3a5d237ef02dd639a43d3f32644a79d8b1f68ab04c4119f8d"} Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.519617 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mnjhh" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.519616 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mnjhh" event={"ID":"fe7d25aa-4c77-48b6-88fe-11339dbca63a","Type":"ContainerDied","Data":"06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d"} Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.520072 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c1a58a8d4e1eb48a3058172821e63b893287ceda6779dc4f9ac86036e7633d" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.521154 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d9-account-create-9fnts" event={"ID":"8070964f-baae-4437-b0b7-2ff91608f0d7","Type":"ContainerDied","Data":"8839bbea3f91000a46dde2a854ff60eb9fa34ff7b770bfb0620bf6435edeaae8"} Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.521263 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8839bbea3f91000a46dde2a854ff60eb9fa34ff7b770bfb0620bf6435edeaae8" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.521267 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f2d9-account-create-9fnts" Nov 22 04:25:40 crc kubenswrapper[4699]: I1122 04:25:40.537601 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-t9rdp" podStartSLOduration=3.146858536 podStartE2EDuration="7.537584117s" podCreationTimestamp="2025-11-22 04:25:33 +0000 UTC" firstStartedPulling="2025-11-22 04:25:35.707545001 +0000 UTC m=+1087.050166188" lastFinishedPulling="2025-11-22 04:25:40.098270572 +0000 UTC m=+1091.440891769" observedRunningTime="2025-11-22 04:25:40.533598071 +0000 UTC m=+1091.876219288" watchObservedRunningTime="2025-11-22 04:25:40.537584117 +0000 UTC m=+1091.880205304" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.292115 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rtkln"] Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293048 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99675e1-93f6-4b73-b4cb-e8f096c3c16e" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293065 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99675e1-93f6-4b73-b4cb-e8f096c3c16e" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293090 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3fa899-c823-4cab-8224-1ca3130f515a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293097 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3fa899-c823-4cab-8224-1ca3130f515a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293109 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8070964f-baae-4437-b0b7-2ff91608f0d7" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293117 4699 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8070964f-baae-4437-b0b7-2ff91608f0d7" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293136 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8afe223c-55c0-40b3-aa14-ea52cad6bccc" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293142 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afe223c-55c0-40b3-aa14-ea52cad6bccc" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293167 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73de98eb-db4a-47f1-b23a-aa38b2db9078" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293173 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="73de98eb-db4a-47f1-b23a-aa38b2db9078" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: E1122 04:25:42.293182 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7d25aa-4c77-48b6-88fe-11339dbca63a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293188 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7d25aa-4c77-48b6-88fe-11339dbca63a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293350 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="73de98eb-db4a-47f1-b23a-aa38b2db9078" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293393 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99675e1-93f6-4b73-b4cb-e8f096c3c16e" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293403 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8070964f-baae-4437-b0b7-2ff91608f0d7" containerName="mariadb-account-create" Nov 22 04:25:42 crc 
kubenswrapper[4699]: I1122 04:25:42.293418 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8afe223c-55c0-40b3-aa14-ea52cad6bccc" containerName="mariadb-account-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293448 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7d25aa-4c77-48b6-88fe-11339dbca63a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.293470 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3fa899-c823-4cab-8224-1ca3130f515a" containerName="mariadb-database-create" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.294188 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.297689 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.298091 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4kqxm" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.316012 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rtkln"] Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.414363 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.414513 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data\") pod \"glance-db-sync-rtkln\" (UID: 
\"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.414568 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.414595 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75h6g\" (UniqueName: \"kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.516097 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.516182 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.516209 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75h6g\" (UniqueName: \"kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " 
pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.516328 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.524646 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.532064 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.533158 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.548713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75h6g\" (UniqueName: \"kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g\") pod \"glance-db-sync-rtkln\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") " pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:42 crc kubenswrapper[4699]: I1122 04:25:42.618205 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-db-sync-rtkln" Nov 22 04:25:43 crc kubenswrapper[4699]: I1122 04:25:43.169195 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rtkln"] Nov 22 04:25:43 crc kubenswrapper[4699]: I1122 04:25:43.287106 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7mlz" podUID="0311366c-c8c7-449c-b617-213a4d87de00" containerName="ovn-controller" probeResult="failure" output=< Nov 22 04:25:43 crc kubenswrapper[4699]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 04:25:43 crc kubenswrapper[4699]: > Nov 22 04:25:43 crc kubenswrapper[4699]: I1122 04:25:43.549042 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rtkln" event={"ID":"feebe11e-01d4-44f9-a95d-9b35d3162cfd","Type":"ContainerStarted","Data":"4e5b3a5379264df3cc048d634634a575501ea4a6b522907a8ee18fba413a1045"} Nov 22 04:25:44 crc kubenswrapper[4699]: I1122 04:25:44.056672 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:25:44 crc kubenswrapper[4699]: I1122 04:25:44.125484 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:25:44 crc kubenswrapper[4699]: I1122 04:25:44.129235 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="dnsmasq-dns" containerID="cri-o://8021c686740304dde58b0fb0395d3e4f3626980a0d98c6f32214792c465711c6" gracePeriod=10 Nov 22 04:25:44 crc kubenswrapper[4699]: I1122 04:25:44.562118 4699 generic.go:334] "Generic (PLEG): container finished" podID="0afac114-a756-478a-a7d0-ec9952944484" containerID="8021c686740304dde58b0fb0395d3e4f3626980a0d98c6f32214792c465711c6" exitCode=0 Nov 22 04:25:44 crc kubenswrapper[4699]: I1122 04:25:44.562206 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" event={"ID":"0afac114-a756-478a-a7d0-ec9952944484","Type":"ContainerDied","Data":"8021c686740304dde58b0fb0395d3e4f3626980a0d98c6f32214792c465711c6"} Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.202159 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.276453 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config\") pod \"0afac114-a756-478a-a7d0-ec9952944484\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.276816 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb\") pod \"0afac114-a756-478a-a7d0-ec9952944484\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.277060 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb\") pod \"0afac114-a756-478a-a7d0-ec9952944484\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.277092 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc\") pod \"0afac114-a756-478a-a7d0-ec9952944484\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.277237 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwlkq\" 
(UniqueName: \"kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq\") pod \"0afac114-a756-478a-a7d0-ec9952944484\" (UID: \"0afac114-a756-478a-a7d0-ec9952944484\") " Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.284275 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq" (OuterVolumeSpecName: "kube-api-access-wwlkq") pod "0afac114-a756-478a-a7d0-ec9952944484" (UID: "0afac114-a756-478a-a7d0-ec9952944484"). InnerVolumeSpecName "kube-api-access-wwlkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.324260 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0afac114-a756-478a-a7d0-ec9952944484" (UID: "0afac114-a756-478a-a7d0-ec9952944484"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.328094 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config" (OuterVolumeSpecName: "config") pod "0afac114-a756-478a-a7d0-ec9952944484" (UID: "0afac114-a756-478a-a7d0-ec9952944484"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.328131 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0afac114-a756-478a-a7d0-ec9952944484" (UID: "0afac114-a756-478a-a7d0-ec9952944484"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.339206 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0afac114-a756-478a-a7d0-ec9952944484" (UID: "0afac114-a756-478a-a7d0-ec9952944484"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.379261 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.379301 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.379313 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.379325 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwlkq\" (UniqueName: \"kubernetes.io/projected/0afac114-a756-478a-a7d0-ec9952944484-kube-api-access-wwlkq\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.379339 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0afac114-a756-478a-a7d0-ec9952944484-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.575165 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" 
event={"ID":"0afac114-a756-478a-a7d0-ec9952944484","Type":"ContainerDied","Data":"546393b3187cfcae29508620bb9ccef1132e3dd9826bcc477ae9f8f8dd727e66"} Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.575342 4699 scope.go:117] "RemoveContainer" containerID="8021c686740304dde58b0fb0395d3e4f3626980a0d98c6f32214792c465711c6" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.575638 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2glrx" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.602413 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.607538 4699 scope.go:117] "RemoveContainer" containerID="a857da9d5b8dd7f78e00333cd79b2d92e649e91a901cbc758ea4a4505b2f846a" Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.610172 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2glrx"] Nov 22 04:25:45 crc kubenswrapper[4699]: I1122 04:25:45.886706 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:25:45 crc kubenswrapper[4699]: E1122 04:25:45.886894 4699 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 04:25:45 crc kubenswrapper[4699]: E1122 04:25:45.886924 4699 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 04:25:45 crc kubenswrapper[4699]: E1122 04:25:45.886982 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift 
podName:ff24634a-9171-4a5c-b045-4c653e032c18 nodeName:}" failed. No retries permitted until 2025-11-22 04:26:01.886963514 +0000 UTC m=+1113.229584701 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift") pod "swift-storage-0" (UID: "ff24634a-9171-4a5c-b045-4c653e032c18") : configmap "swift-ring-files" not found Nov 22 04:25:47 crc kubenswrapper[4699]: I1122 04:25:47.460080 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0afac114-a756-478a-a7d0-ec9952944484" path="/var/lib/kubelet/pods/0afac114-a756-478a-a7d0-ec9952944484/volumes" Nov 22 04:25:48 crc kubenswrapper[4699]: I1122 04:25:48.291868 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7mlz" podUID="0311366c-c8c7-449c-b617-213a4d87de00" containerName="ovn-controller" probeResult="failure" output=< Nov 22 04:25:48 crc kubenswrapper[4699]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 04:25:48 crc kubenswrapper[4699]: > Nov 22 04:25:48 crc kubenswrapper[4699]: I1122 04:25:48.599648 4699 generic.go:334] "Generic (PLEG): container finished" podID="ed96c0b0-7b76-4f03-b352-461405bbfb23" containerID="cafd0a208383aab3a5d237ef02dd639a43d3f32644a79d8b1f68ab04c4119f8d" exitCode=0 Nov 22 04:25:48 crc kubenswrapper[4699]: I1122 04:25:48.599741 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t9rdp" event={"ID":"ed96c0b0-7b76-4f03-b352-461405bbfb23","Type":"ContainerDied","Data":"cafd0a208383aab3a5d237ef02dd639a43d3f32644a79d8b1f68ab04c4119f8d"} Nov 22 04:25:49 crc kubenswrapper[4699]: I1122 04:25:49.017460 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.293220 4699 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-s7mlz" podUID="0311366c-c8c7-449c-b617-213a4d87de00" containerName="ovn-controller" probeResult="failure" output=< Nov 22 04:25:53 crc kubenswrapper[4699]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 04:25:53 crc kubenswrapper[4699]: > Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.310707 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.327130 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j7b96" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.591742 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7mlz-config-74bzz"] Nov 22 04:25:53 crc kubenswrapper[4699]: E1122 04:25:53.592784 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="dnsmasq-dns" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.592801 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="dnsmasq-dns" Nov 22 04:25:53 crc kubenswrapper[4699]: E1122 04:25:53.592827 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="init" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.592833 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="init" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.592996 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0afac114-a756-478a-a7d0-ec9952944484" containerName="dnsmasq-dns" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.593568 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.596067 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.617109 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7mlz-config-74bzz"] Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720568 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptjt\" (UniqueName: \"kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720641 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720676 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720702 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: 
\"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720798 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.720838 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822012 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xptjt\" (UniqueName: \"kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822101 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn\") pod 
\"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822122 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822156 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822184 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822474 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822525 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: 
\"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.822882 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.823002 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.824130 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.850129 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptjt\" (UniqueName: \"kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt\") pod \"ovn-controller-s7mlz-config-74bzz\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:53 crc kubenswrapper[4699]: I1122 04:25:53.924336 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:54 crc kubenswrapper[4699]: I1122 04:25:54.646881 4699 generic.go:334] "Generic (PLEG): container finished" podID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerID="54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49" exitCode=0 Nov 22 04:25:54 crc kubenswrapper[4699]: I1122 04:25:54.646970 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerDied","Data":"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49"} Nov 22 04:25:54 crc kubenswrapper[4699]: I1122 04:25:54.648782 4699 generic.go:334] "Generic (PLEG): container finished" podID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerID="922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4" exitCode=0 Nov 22 04:25:54 crc kubenswrapper[4699]: I1122 04:25:54.648792 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerDied","Data":"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4"} Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.883802 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.958297 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.958476 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.959404 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.959491 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7zb\" (UniqueName: \"kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.959971 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.960263 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.960317 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.960636 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices\") pod \"ed96c0b0-7b76-4f03-b352-461405bbfb23\" (UID: \"ed96c0b0-7b76-4f03-b352-461405bbfb23\") " Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.961365 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.961681 4699 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ed96c0b0-7b76-4f03-b352-461405bbfb23-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.961704 4699 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.963683 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb" (OuterVolumeSpecName: "kube-api-access-lp7zb") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "kube-api-access-lp7zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.966574 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.980354 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.980897 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:25:55 crc kubenswrapper[4699]: I1122 04:25:55.990606 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts" (OuterVolumeSpecName: "scripts") pod "ed96c0b0-7b76-4f03-b352-461405bbfb23" (UID: "ed96c0b0-7b76-4f03-b352-461405bbfb23"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.063355 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7zb\" (UniqueName: \"kubernetes.io/projected/ed96c0b0-7b76-4f03-b352-461405bbfb23-kube-api-access-lp7zb\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.063390 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed96c0b0-7b76-4f03-b352-461405bbfb23-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.063400 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.063408 4699 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.063489 4699 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ed96c0b0-7b76-4f03-b352-461405bbfb23-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.120602 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7mlz-config-74bzz"] Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.665482 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-t9rdp" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.665491 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-t9rdp" event={"ID":"ed96c0b0-7b76-4f03-b352-461405bbfb23","Type":"ContainerDied","Data":"25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e"} Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.666183 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25705f511b40cf6291aaccc4eea5af1e5a9f8cd53b1df1abfe2f62335438b39e" Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.666993 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz-config-74bzz" event={"ID":"6e5d51d5-3dbb-4375-b090-b588f373647b","Type":"ContainerStarted","Data":"c32e2d1bb5fc1bc7f8e4e31e454af69322bf5fa9a4fdd1bf53a07d6c8999d8ed"} Nov 22 04:25:56 crc kubenswrapper[4699]: I1122 04:25:56.667029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz-config-74bzz" event={"ID":"6e5d51d5-3dbb-4375-b090-b588f373647b","Type":"ContainerStarted","Data":"f9e1dc90b4291bca1b1190236ae4e75a56a38ff8848f0cd715152af5da7d1fb4"} Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.677858 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e5d51d5-3dbb-4375-b090-b588f373647b" containerID="c32e2d1bb5fc1bc7f8e4e31e454af69322bf5fa9a4fdd1bf53a07d6c8999d8ed" exitCode=0 Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.677973 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz-config-74bzz" event={"ID":"6e5d51d5-3dbb-4375-b090-b588f373647b","Type":"ContainerDied","Data":"c32e2d1bb5fc1bc7f8e4e31e454af69322bf5fa9a4fdd1bf53a07d6c8999d8ed"} Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.681560 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerStarted","Data":"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2"} Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.681892 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.684507 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerStarted","Data":"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857"} Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.684795 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.716292 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.94469927 podStartE2EDuration="1m55.716275548s" podCreationTimestamp="2025-11-22 04:24:02 +0000 UTC" firstStartedPulling="2025-11-22 04:24:03.978053202 +0000 UTC m=+995.320674389" lastFinishedPulling="2025-11-22 04:25:20.74962948 +0000 UTC m=+1072.092250667" observedRunningTime="2025-11-22 04:25:57.713919111 +0000 UTC m=+1109.056540298" watchObservedRunningTime="2025-11-22 04:25:57.716275548 +0000 UTC m=+1109.058896735" Nov 22 04:25:57 crc kubenswrapper[4699]: I1122 04:25:57.740740 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.514207604 podStartE2EDuration="1m55.740719669s" podCreationTimestamp="2025-11-22 04:24:02 +0000 UTC" firstStartedPulling="2025-11-22 04:24:04.341480272 +0000 UTC m=+995.684101449" lastFinishedPulling="2025-11-22 04:25:20.567992327 +0000 UTC m=+1071.910613514" observedRunningTime="2025-11-22 04:25:57.733762491 +0000 UTC m=+1109.076383698" watchObservedRunningTime="2025-11-22 
04:25:57.740719669 +0000 UTC m=+1109.083340856" Nov 22 04:25:58 crc kubenswrapper[4699]: I1122 04:25:58.292249 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s7mlz" Nov 22 04:25:58 crc kubenswrapper[4699]: I1122 04:25:58.695906 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rtkln" event={"ID":"feebe11e-01d4-44f9-a95d-9b35d3162cfd","Type":"ContainerStarted","Data":"0bb047593a6b282bfac14d88287ef8f339b80ee2b324fc6cd2220f5fcb8cf7da"} Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.009994 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.027945 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rtkln" podStartSLOduration=3.182832663 podStartE2EDuration="17.027925051s" podCreationTimestamp="2025-11-22 04:25:42 +0000 UTC" firstStartedPulling="2025-11-22 04:25:43.177585776 +0000 UTC m=+1094.520206993" lastFinishedPulling="2025-11-22 04:25:57.022678204 +0000 UTC m=+1108.365299381" observedRunningTime="2025-11-22 04:25:58.729016752 +0000 UTC m=+1110.071637949" watchObservedRunningTime="2025-11-22 04:25:59.027925051 +0000 UTC m=+1110.370546238" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113750 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113793 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: 
\"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113861 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113917 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113929 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113960 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.113990 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xptjt\" (UniqueName: \"kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt\") pod \"6e5d51d5-3dbb-4375-b090-b588f373647b\" (UID: \"6e5d51d5-3dbb-4375-b090-b588f373647b\") " Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.114295 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.114365 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run" (OuterVolumeSpecName: "var-run") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.114920 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.114947 4699 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.114980 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts" (OuterVolumeSpecName: "scripts") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.115005 4699 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.115022 4699 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6e5d51d5-3dbb-4375-b090-b588f373647b-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.121744 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt" (OuterVolumeSpecName: "kube-api-access-xptjt") pod "6e5d51d5-3dbb-4375-b090-b588f373647b" (UID: "6e5d51d5-3dbb-4375-b090-b588f373647b"). InnerVolumeSpecName "kube-api-access-xptjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.216378 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.216425 4699 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6e5d51d5-3dbb-4375-b090-b588f373647b-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.216455 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xptjt\" (UniqueName: \"kubernetes.io/projected/6e5d51d5-3dbb-4375-b090-b588f373647b-kube-api-access-xptjt\") on node \"crc\" DevicePath \"\"" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.703842 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7mlz-config-74bzz" event={"ID":"6e5d51d5-3dbb-4375-b090-b588f373647b","Type":"ContainerDied","Data":"f9e1dc90b4291bca1b1190236ae4e75a56a38ff8848f0cd715152af5da7d1fb4"} Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.703893 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9e1dc90b4291bca1b1190236ae4e75a56a38ff8848f0cd715152af5da7d1fb4" Nov 22 04:25:59 crc kubenswrapper[4699]: I1122 04:25:59.703862 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7mlz-config-74bzz" Nov 22 04:26:00 crc kubenswrapper[4699]: I1122 04:26:00.117146 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s7mlz-config-74bzz"] Nov 22 04:26:00 crc kubenswrapper[4699]: I1122 04:26:00.122618 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s7mlz-config-74bzz"] Nov 22 04:26:01 crc kubenswrapper[4699]: I1122 04:26:01.459703 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e5d51d5-3dbb-4375-b090-b588f373647b" path="/var/lib/kubelet/pods/6e5d51d5-3dbb-4375-b090-b588f373647b/volumes" Nov 22 04:26:01 crc kubenswrapper[4699]: I1122 04:26:01.960601 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:26:01 crc kubenswrapper[4699]: I1122 04:26:01.980555 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ff24634a-9171-4a5c-b045-4c653e032c18-etc-swift\") pod \"swift-storage-0\" (UID: \"ff24634a-9171-4a5c-b045-4c653e032c18\") " pod="openstack/swift-storage-0" Nov 22 04:26:02 crc kubenswrapper[4699]: I1122 04:26:02.028342 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 22 04:26:02 crc kubenswrapper[4699]: I1122 04:26:02.549120 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 22 04:26:02 crc kubenswrapper[4699]: W1122 04:26:02.552194 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff24634a_9171_4a5c_b045_4c653e032c18.slice/crio-487a4a624cb7cf37694c881ec0a30e747c1d513e0ff877c8acf1a16b3496880b WatchSource:0}: Error finding container 487a4a624cb7cf37694c881ec0a30e747c1d513e0ff877c8acf1a16b3496880b: Status 404 returned error can't find the container with id 487a4a624cb7cf37694c881ec0a30e747c1d513e0ff877c8acf1a16b3496880b Nov 22 04:26:02 crc kubenswrapper[4699]: I1122 04:26:02.725217 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"487a4a624cb7cf37694c881ec0a30e747c1d513e0ff877c8acf1a16b3496880b"} Nov 22 04:26:04 crc kubenswrapper[4699]: I1122 04:26:04.741570 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"9d4cc7830ffbc65cc5d808620d5042b6883e4cfa9253136fe7a7eab7ad2d0063"} Nov 22 04:26:04 crc kubenswrapper[4699]: I1122 04:26:04.741935 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"9a2337fa8ab2d4696f0b43debe9023fdf8e62fbab3b679350e25bf7b4f600e22"} Nov 22 04:26:04 crc kubenswrapper[4699]: I1122 04:26:04.741950 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"dd639c9ccbdb1bd490664332ef822c721ac4e5d0f7578529ecad9b3f6f9cc27c"} Nov 22 04:26:04 crc kubenswrapper[4699]: 
I1122 04:26:04.741964 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"2a27cf8e58bf13287fc59f0b9a0e9628fba8678b03673b1526d9aacc5663ae53"} Nov 22 04:26:05 crc kubenswrapper[4699]: I1122 04:26:05.752041 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"1b57f793455100a5d4fd83bb50e8a03597759512f3e3a7f670e970136997ad74"} Nov 22 04:26:05 crc kubenswrapper[4699]: I1122 04:26:05.847888 4699 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5ffa3cb0_0c6f_4fd1_9325_c4b2cd672a50.slice" Nov 22 04:26:05 crc kubenswrapper[4699]: E1122 04:26:05.848013 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5ffa3cb0_0c6f_4fd1_9325_c4b2cd672a50.slice" pod="openstack/swift-ring-rebalance-w4rqf" podUID="5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.767670 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"f1eb6881a5ba904e1f5b7094a12bbdfc5da595392aee94d8739d9e348436cf8b"} Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.768013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"88ad873c963dfe9b7d4ff84be759e7faf4c290cd341413ff79ceb7c4ebb82805"}
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.768023 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"ce271090972b5616706ab2ceedf79e1973d9c2f33497fd054204341b2167f243"}
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.769626 4699 generic.go:334] "Generic (PLEG): container finished" podID="feebe11e-01d4-44f9-a95d-9b35d3162cfd" containerID="0bb047593a6b282bfac14d88287ef8f339b80ee2b324fc6cd2220f5fcb8cf7da" exitCode=0
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.769680 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w4rqf"
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.769806 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rtkln" event={"ID":"feebe11e-01d4-44f9-a95d-9b35d3162cfd","Type":"ContainerDied","Data":"0bb047593a6b282bfac14d88287ef8f339b80ee2b324fc6cd2220f5fcb8cf7da"}
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.825924 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-w4rqf"]
Nov 22 04:26:06 crc kubenswrapper[4699]: I1122 04:26:06.836601 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-w4rqf"]
Nov 22 04:26:07 crc kubenswrapper[4699]: I1122 04:26:07.457381 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50" path="/var/lib/kubelet/pods/5ffa3cb0-0c6f-4fd1-9325-c4b2cd672a50/volumes"
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.135404 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rtkln"
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.263301 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data\") pod \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") "
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.263375 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data\") pod \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") "
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.263425 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle\") pod \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") "
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.263567 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75h6g\" (UniqueName: \"kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g\") pod \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\" (UID: \"feebe11e-01d4-44f9-a95d-9b35d3162cfd\") "
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.272295 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "feebe11e-01d4-44f9-a95d-9b35d3162cfd" (UID: "feebe11e-01d4-44f9-a95d-9b35d3162cfd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.273026 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g" (OuterVolumeSpecName: "kube-api-access-75h6g") pod "feebe11e-01d4-44f9-a95d-9b35d3162cfd" (UID: "feebe11e-01d4-44f9-a95d-9b35d3162cfd"). InnerVolumeSpecName "kube-api-access-75h6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.297167 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feebe11e-01d4-44f9-a95d-9b35d3162cfd" (UID: "feebe11e-01d4-44f9-a95d-9b35d3162cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.318212 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data" (OuterVolumeSpecName: "config-data") pod "feebe11e-01d4-44f9-a95d-9b35d3162cfd" (UID: "feebe11e-01d4-44f9-a95d-9b35d3162cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.365225 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.365518 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.365594 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feebe11e-01d4-44f9-a95d-9b35d3162cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.365711 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75h6g\" (UniqueName: \"kubernetes.io/projected/feebe11e-01d4-44f9-a95d-9b35d3162cfd-kube-api-access-75h6g\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.786256 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rtkln"
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.786515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rtkln" event={"ID":"feebe11e-01d4-44f9-a95d-9b35d3162cfd","Type":"ContainerDied","Data":"4e5b3a5379264df3cc048d634634a575501ea4a6b522907a8ee18fba413a1045"}
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.786795 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5b3a5379264df3cc048d634634a575501ea4a6b522907a8ee18fba413a1045"
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.825216 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"d7fc738a1075d31ddc20ddf970d6e0f76d02ab9bd8582e26b9618db43b630f3f"}
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.825266 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"f500dd8784707501b949718ff51f3f664b1d053fd4defc92d51de3ef402de9e3"}
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.825279 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"91dd96f323ddba98c2ba60a4d17710573333f18e65d0999e37bceff8451c453e"}
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.825289 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"e957732f4a990f1f7b2a39171629e160976499d29ee345f3c0aaa31f0ec3a043"}
Nov 22 04:26:08 crc kubenswrapper[4699]: I1122 04:26:08.825317 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"fba0b2a96d7b67d0c2f032a426a68380537a5cb05849699bb09f4bb4fcf51103"}
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.224481 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:09 crc kubenswrapper[4699]: E1122 04:26:09.225267 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e5d51d5-3dbb-4375-b090-b588f373647b" containerName="ovn-config"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225284 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e5d51d5-3dbb-4375-b090-b588f373647b" containerName="ovn-config"
Nov 22 04:26:09 crc kubenswrapper[4699]: E1122 04:26:09.225303 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feebe11e-01d4-44f9-a95d-9b35d3162cfd" containerName="glance-db-sync"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225311 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="feebe11e-01d4-44f9-a95d-9b35d3162cfd" containerName="glance-db-sync"
Nov 22 04:26:09 crc kubenswrapper[4699]: E1122 04:26:09.225323 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed96c0b0-7b76-4f03-b352-461405bbfb23" containerName="swift-ring-rebalance"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225330 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed96c0b0-7b76-4f03-b352-461405bbfb23" containerName="swift-ring-rebalance"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225547 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e5d51d5-3dbb-4375-b090-b588f373647b" containerName="ovn-config"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225565 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="feebe11e-01d4-44f9-a95d-9b35d3162cfd" containerName="glance-db-sync"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.225585 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed96c0b0-7b76-4f03-b352-461405bbfb23" containerName="swift-ring-rebalance"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.226715 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.252320 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.380577 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.380837 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.380894 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.380919 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gzv\" (UniqueName: \"kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.380961 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.482849 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.482968 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.482997 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.483054 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gzv\" (UniqueName: \"kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.483091 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.484011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.484075 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.484074 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.484454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.505573 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gzv\" (UniqueName: \"kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv\") pod \"dnsmasq-dns-5b946c75cc-npt6m\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") " pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.608971 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.839304 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"cedb3cb2eb674426a42122ff4d753308509d678617d66ec4f1398bba698795df"}
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.839677 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ff24634a-9171-4a5c-b045-4c653e032c18","Type":"ContainerStarted","Data":"6836115c994f94fee5b7b52e1d8892d32bd9aa623809a97c6cc766f5058af963"}
Nov 22 04:26:09 crc kubenswrapper[4699]: I1122 04:26:09.874821 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.680950911 podStartE2EDuration="41.874801626s" podCreationTimestamp="2025-11-22 04:25:28 +0000 UTC" firstStartedPulling="2025-11-22 04:26:02.554021511 +0000 UTC m=+1113.896642698" lastFinishedPulling="2025-11-22 04:26:07.747872226 +0000 UTC m=+1119.090493413" observedRunningTime="2025-11-22 04:26:09.872717796 +0000 UTC m=+1121.215339003" watchObservedRunningTime="2025-11-22 04:26:09.874801626 +0000 UTC m=+1121.217422823"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.045700 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.141130 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.172943 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"]
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.174662 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.177172 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.182506 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"]
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.303963 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.305176 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.305279 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.305400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.305575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzsj\" (UniqueName: \"kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.305674 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407081 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407158 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407207 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407238 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407284 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.407309 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzsj\" (UniqueName: \"kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.408064 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.408130 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.408177 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.408559 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.408560 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.432083 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzsj\" (UniqueName: \"kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj\") pod \"dnsmasq-dns-74f6bcbc87-x5pmx\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.530648 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.758612 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"]
Nov 22 04:26:10 crc kubenswrapper[4699]: W1122 04:26:10.764106 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9 WatchSource:0}: Error finding container d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9: Status 404 returned error can't find the container with id d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.850858 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" event={"ID":"417b0282-cef1-4a7c-aca5-593297254fe3","Type":"ContainerStarted","Data":"d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9"}
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.852605 4699 generic.go:334] "Generic (PLEG): container finished" podID="1321f446-0e56-42d5-9f7c-f01f9261da44" containerID="cac2499b9d44220e20f66682719666bba04a4034a530fb489d93279d4debe170" exitCode=0
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.853816 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m" event={"ID":"1321f446-0e56-42d5-9f7c-f01f9261da44","Type":"ContainerDied","Data":"cac2499b9d44220e20f66682719666bba04a4034a530fb489d93279d4debe170"}
Nov 22 04:26:10 crc kubenswrapper[4699]: I1122 04:26:10.853844 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m" event={"ID":"1321f446-0e56-42d5-9f7c-f01f9261da44","Type":"ContainerStarted","Data":"2b07e9369560a686166f24ec5fcd22c51b5c9b94abb1d6a258c6dbc592a4432a"}
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.166106 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.221877 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config\") pod \"1321f446-0e56-42d5-9f7c-f01f9261da44\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") "
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.221929 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc\") pod \"1321f446-0e56-42d5-9f7c-f01f9261da44\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") "
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.221985 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2gzv\" (UniqueName: \"kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv\") pod \"1321f446-0e56-42d5-9f7c-f01f9261da44\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") "
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.222024 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb\") pod \"1321f446-0e56-42d5-9f7c-f01f9261da44\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") "
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.227166 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv" (OuterVolumeSpecName: "kube-api-access-g2gzv") pod "1321f446-0e56-42d5-9f7c-f01f9261da44" (UID: "1321f446-0e56-42d5-9f7c-f01f9261da44"). InnerVolumeSpecName "kube-api-access-g2gzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.243287 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1321f446-0e56-42d5-9f7c-f01f9261da44" (UID: "1321f446-0e56-42d5-9f7c-f01f9261da44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.247754 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1321f446-0e56-42d5-9f7c-f01f9261da44" (UID: "1321f446-0e56-42d5-9f7c-f01f9261da44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.248494 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config" (OuterVolumeSpecName: "config") pod "1321f446-0e56-42d5-9f7c-f01f9261da44" (UID: "1321f446-0e56-42d5-9f7c-f01f9261da44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.323845 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb\") pod \"1321f446-0e56-42d5-9f7c-f01f9261da44\" (UID: \"1321f446-0e56-42d5-9f7c-f01f9261da44\") "
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.324465 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2gzv\" (UniqueName: \"kubernetes.io/projected/1321f446-0e56-42d5-9f7c-f01f9261da44-kube-api-access-g2gzv\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.324476 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.324486 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-config\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.324494 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.344686 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1321f446-0e56-42d5-9f7c-f01f9261da44" (UID: "1321f446-0e56-42d5-9f7c-f01f9261da44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.429689 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1321f446-0e56-42d5-9f7c-f01f9261da44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.860741 4699 generic.go:334] "Generic (PLEG): container finished" podID="417b0282-cef1-4a7c-aca5-593297254fe3" containerID="e19911f188c7fc263ab27377da1b53e1a0be2f60dcb34322a9569882b57b1755" exitCode=0
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.860804 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" event={"ID":"417b0282-cef1-4a7c-aca5-593297254fe3","Type":"ContainerDied","Data":"e19911f188c7fc263ab27377da1b53e1a0be2f60dcb34322a9569882b57b1755"}
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.862874 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m" event={"ID":"1321f446-0e56-42d5-9f7c-f01f9261da44","Type":"ContainerDied","Data":"2b07e9369560a686166f24ec5fcd22c51b5c9b94abb1d6a258c6dbc592a4432a"}
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.862901 4699 scope.go:117] "RemoveContainer" containerID="cac2499b9d44220e20f66682719666bba04a4034a530fb489d93279d4debe170"
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.862987 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-npt6m"
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.924120 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:11 crc kubenswrapper[4699]: I1122 04:26:11.930751 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-npt6m"]
Nov 22 04:26:12 crc kubenswrapper[4699]: I1122 04:26:12.874304 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" event={"ID":"417b0282-cef1-4a7c-aca5-593297254fe3","Type":"ContainerStarted","Data":"a1a5b0166aeaff548a75355e9677f3747a6bae4fbd895a37d195f1a79b30ca91"}
Nov 22 04:26:12 crc kubenswrapper[4699]: I1122 04:26:12.874726 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx"
Nov 22 04:26:12 crc kubenswrapper[4699]: I1122 04:26:12.900122 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podStartSLOduration=2.900102644 podStartE2EDuration="2.900102644s" podCreationTimestamp="2025-11-22 04:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:12.895514223 +0000 UTC m=+1124.238135420" watchObservedRunningTime="2025-11-22 04:26:12.900102644 +0000 UTC m=+1124.242723841"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.420763 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.463788 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1321f446-0e56-42d5-9f7c-f01f9261da44" path="/var/lib/kubelet/pods/1321f446-0e56-42d5-9f7c-f01f9261da44/volumes"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.726661 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.779686 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xj7gh"]
Nov 22 04:26:13 crc kubenswrapper[4699]: E1122 04:26:13.780157 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1321f446-0e56-42d5-9f7c-f01f9261da44" containerName="init"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.780194 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="1321f446-0e56-42d5-9f7c-f01f9261da44" containerName="init"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.780408 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="1321f446-0e56-42d5-9f7c-f01f9261da44" containerName="init"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.781165 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xj7gh"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.788120 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c223-account-create-gfsvr"]
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.789295 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c223-account-create-gfsvr"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.790837 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.814765 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c223-account-create-gfsvr"]
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.824794 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xj7gh"]
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.873374 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8bcs\" (UniqueName: \"kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.873461 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.873549 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrk2\" (UniqueName: \"kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh"
Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.873575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.875379 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6kgwt"] Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.876914 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.890918 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6kgwt"] Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.970991 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-919f-account-create-zkhx4"] Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrk2\" (UniqueName: \"kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975450 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975490 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts\") pod \"barbican-db-create-6kgwt\" (UID: 
\"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2sw\" (UniqueName: \"kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw\") pod \"barbican-db-create-6kgwt\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975716 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8bcs\" (UniqueName: \"kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.975771 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.976814 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.977369 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.978289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:13 crc kubenswrapper[4699]: I1122 04:26:13.980101 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.003020 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-919f-account-create-zkhx4"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.006594 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8bcs\" (UniqueName: \"kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs\") pod \"cinder-c223-account-create-gfsvr\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.017038 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrk2\" (UniqueName: \"kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2\") pod \"cinder-db-create-xj7gh\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:14 crc 
kubenswrapper[4699]: I1122 04:26:14.066044 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8xlt7"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.069408 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.073893 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.073956 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.074170 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qfldf" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.074230 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.077166 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts\") pod \"barbican-db-create-6kgwt\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.077278 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2sw\" (UniqueName: \"kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw\") pod \"barbican-db-create-6kgwt\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.077345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.077407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znz52\" (UniqueName: \"kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.078082 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts\") pod \"barbican-db-create-6kgwt\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.126965 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8xlt7"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.137521 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jrsxw"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.138769 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.140562 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.145568 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2sw\" (UniqueName: \"kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw\") pod \"barbican-db-create-6kgwt\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.158221 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jrsxw"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.163417 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.169482 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5ac7-account-create-jmn9n"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.171042 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.175089 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.177545 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ac7-account-create-jmn9n"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.178456 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znz52\" (UniqueName: \"kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.178527 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.178587 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm45j\" (UniqueName: \"kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.178655 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " 
pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.178693 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.181345 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.203088 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.206909 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znz52\" (UniqueName: \"kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52\") pod \"barbican-919f-account-create-zkhx4\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280406 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280478 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ct92\" (UniqueName: \"kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5q4\" (UniqueName: \"kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4\") pod \"neutron-db-create-jrsxw\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280535 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts\") pod \"neutron-db-create-jrsxw\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280567 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.280605 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fm45j\" (UniqueName: \"kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.285493 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.289891 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.293994 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.303925 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm45j\" (UniqueName: \"kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j\") pod \"keystone-db-sync-8xlt7\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.381700 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.382039 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ct92\" (UniqueName: \"kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.382065 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5q4\" (UniqueName: \"kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4\") pod \"neutron-db-create-jrsxw\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.382110 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts\") pod \"neutron-db-create-jrsxw\" (UID: 
\"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.382823 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts\") pod \"neutron-db-create-jrsxw\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.389617 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.403109 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5q4\" (UniqueName: \"kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4\") pod \"neutron-db-create-jrsxw\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.404328 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ct92\" (UniqueName: \"kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92\") pod \"neutron-5ac7-account-create-jmn9n\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.460947 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.485069 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xj7gh"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.567822 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.575068 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.738394 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-919f-account-create-zkhx4"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.762560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c223-account-create-gfsvr"] Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.769799 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6kgwt"] Nov 22 04:26:14 crc kubenswrapper[4699]: W1122 04:26:14.780190 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece646f4_4f02_4a0d_9dde_ffb3d913410e.slice/crio-e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc WatchSource:0}: Error finding container e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc: Status 404 returned error can't find the container with id e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.899818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kgwt" event={"ID":"96c287a2-30d6-4055-aea8-d104dbd472c2","Type":"ContainerStarted","Data":"2d939a5991230498c4fb3e3f03726eedd68a162dd545ddd264fa2cfae0ee8155"} Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 
04:26:14.905151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c223-account-create-gfsvr" event={"ID":"ece646f4-4f02-4a0d-9dde-ffb3d913410e","Type":"ContainerStarted","Data":"e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc"} Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.909622 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-919f-account-create-zkhx4" event={"ID":"479c04e1-21fc-4674-98ac-3abc9ba96b34","Type":"ContainerStarted","Data":"46e467a9d337e1b1a658a5f245635fd6530ed8ddf9bd2d710ee8056bca7204f0"} Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.918103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xj7gh" event={"ID":"67126a3c-ec10-4f12-96ad-0133fcabb75f","Type":"ContainerStarted","Data":"ef8020ddface36a37b2c7f54d500c076f405aba4a7d91d37d05a3debfe9dc74c"} Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.918154 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xj7gh" event={"ID":"67126a3c-ec10-4f12-96ad-0133fcabb75f","Type":"ContainerStarted","Data":"91733fbcf60df505571b2c97a8bd3196b5cbb14ee29792a954ce43f57d875c2a"} Nov 22 04:26:14 crc kubenswrapper[4699]: I1122 04:26:14.937081 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-xj7gh" podStartSLOduration=1.937050618 podStartE2EDuration="1.937050618s" podCreationTimestamp="2025-11-22 04:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:14.932370095 +0000 UTC m=+1126.274991302" watchObservedRunningTime="2025-11-22 04:26:14.937050618 +0000 UTC m=+1126.279671805" Nov 22 04:26:15 crc kubenswrapper[4699]: W1122 04:26:15.124861 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34007e0a_511e_41f1_b3fc_810d7911d11c.slice/crio-ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc WatchSource:0}: Error finding container ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc: Status 404 returned error can't find the container with id ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.130415 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8xlt7"] Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.147547 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jrsxw"] Nov 22 04:26:15 crc kubenswrapper[4699]: W1122 04:26:15.154378 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7817603d_cfdf_425d_84de_095ff5b5674e.slice/crio-fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c WatchSource:0}: Error finding container fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c: Status 404 returned error can't find the container with id fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.193010 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ac7-account-create-jmn9n"] Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.926758 4699 generic.go:334] "Generic (PLEG): container finished" podID="67126a3c-ec10-4f12-96ad-0133fcabb75f" containerID="ef8020ddface36a37b2c7f54d500c076f405aba4a7d91d37d05a3debfe9dc74c" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.926818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xj7gh" 
event={"ID":"67126a3c-ec10-4f12-96ad-0133fcabb75f","Type":"ContainerDied","Data":"ef8020ddface36a37b2c7f54d500c076f405aba4a7d91d37d05a3debfe9dc74c"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.930120 4699 generic.go:334] "Generic (PLEG): container finished" podID="96c287a2-30d6-4055-aea8-d104dbd472c2" containerID="d583b2f70707a4953393b4533486d274bb9897646119ce4c6029c87a6316f924" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.930171 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kgwt" event={"ID":"96c287a2-30d6-4055-aea8-d104dbd472c2","Type":"ContainerDied","Data":"d583b2f70707a4953393b4533486d274bb9897646119ce4c6029c87a6316f924"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.931654 4699 generic.go:334] "Generic (PLEG): container finished" podID="ddb8d79c-fba7-489b-8953-f507deb03a03" containerID="25797f2bcddd3b7b79c299632af5532e867769ad77643c3195487f0fc0e5ffac" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.931693 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ac7-account-create-jmn9n" event={"ID":"ddb8d79c-fba7-489b-8953-f507deb03a03","Type":"ContainerDied","Data":"25797f2bcddd3b7b79c299632af5532e867769ad77643c3195487f0fc0e5ffac"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.931709 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ac7-account-create-jmn9n" event={"ID":"ddb8d79c-fba7-489b-8953-f507deb03a03","Type":"ContainerStarted","Data":"00b90b9adaa6420f2ec3d998c68f71f5baf55b757f2905875fcddc3a9317eca4"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.933212 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8xlt7" event={"ID":"34007e0a-511e-41f1-b3fc-810d7911d11c","Type":"ContainerStarted","Data":"ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.937990 4699 generic.go:334] 
"Generic (PLEG): container finished" podID="ece646f4-4f02-4a0d-9dde-ffb3d913410e" containerID="98c064a9afd4a919e21ad54c156ab613a1bd117554fc82d598efa73401984447" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.938138 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c223-account-create-gfsvr" event={"ID":"ece646f4-4f02-4a0d-9dde-ffb3d913410e","Type":"ContainerDied","Data":"98c064a9afd4a919e21ad54c156ab613a1bd117554fc82d598efa73401984447"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.943020 4699 generic.go:334] "Generic (PLEG): container finished" podID="479c04e1-21fc-4674-98ac-3abc9ba96b34" containerID="1f0da47f8055f67e2b11ad450c78bd219e11f2e4b3dd06a155e2031ab3f1d002" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.943088 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-919f-account-create-zkhx4" event={"ID":"479c04e1-21fc-4674-98ac-3abc9ba96b34","Type":"ContainerDied","Data":"1f0da47f8055f67e2b11ad450c78bd219e11f2e4b3dd06a155e2031ab3f1d002"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.946985 4699 generic.go:334] "Generic (PLEG): container finished" podID="7817603d-cfdf-425d-84de-095ff5b5674e" containerID="7fe951a8339cdd4c0b84481d3d38e606bee6aae754cefa3921aea9b07129c6fc" exitCode=0 Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.947032 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jrsxw" event={"ID":"7817603d-cfdf-425d-84de-095ff5b5674e","Type":"ContainerDied","Data":"7fe951a8339cdd4c0b84481d3d38e606bee6aae754cefa3921aea9b07129c6fc"} Nov 22 04:26:15 crc kubenswrapper[4699]: I1122 04:26:15.947054 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jrsxw" event={"ID":"7817603d-cfdf-425d-84de-095ff5b5674e","Type":"ContainerStarted","Data":"fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.007926 
4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.014494 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jrsxw" event={"ID":"7817603d-cfdf-425d-84de-095ff5b5674e","Type":"ContainerDied","Data":"fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.014557 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb384a83aa1d5abccb79b812da9f9970f9d6c7a42dff498a5c37188d68a391c" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.014637 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jrsxw" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.031728 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xj7gh" event={"ID":"67126a3c-ec10-4f12-96ad-0133fcabb75f","Type":"ContainerDied","Data":"91733fbcf60df505571b2c97a8bd3196b5cbb14ee29792a954ce43f57d875c2a"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.032004 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91733fbcf60df505571b2c97a8bd3196b5cbb14ee29792a954ce43f57d875c2a" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.038918 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6kgwt" event={"ID":"96c287a2-30d6-4055-aea8-d104dbd472c2","Type":"ContainerDied","Data":"2d939a5991230498c4fb3e3f03726eedd68a162dd545ddd264fa2cfae0ee8155"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.038960 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d939a5991230498c4fb3e3f03726eedd68a162dd545ddd264fa2cfae0ee8155" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.040664 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-5ac7-account-create-jmn9n" event={"ID":"ddb8d79c-fba7-489b-8953-f507deb03a03","Type":"ContainerDied","Data":"00b90b9adaa6420f2ec3d998c68f71f5baf55b757f2905875fcddc3a9317eca4"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.040710 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b90b9adaa6420f2ec3d998c68f71f5baf55b757f2905875fcddc3a9317eca4" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.044692 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c223-account-create-gfsvr" event={"ID":"ece646f4-4f02-4a0d-9dde-ffb3d913410e","Type":"ContainerDied","Data":"e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.044770 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e554b943d6baeacdd6da901d3f233cb64b6703f11b21c13e20b32c7b3adc72cc" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.046829 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-919f-account-create-zkhx4" event={"ID":"479c04e1-21fc-4674-98ac-3abc9ba96b34","Type":"ContainerDied","Data":"46e467a9d337e1b1a658a5f245635fd6530ed8ddf9bd2d710ee8056bca7204f0"} Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.046865 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e467a9d337e1b1a658a5f245635fd6530ed8ddf9bd2d710ee8056bca7204f0" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.070125 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.096992 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m5q4\" (UniqueName: \"kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4\") pod \"7817603d-cfdf-425d-84de-095ff5b5674e\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.097057 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts\") pod \"7817603d-cfdf-425d-84de-095ff5b5674e\" (UID: \"7817603d-cfdf-425d-84de-095ff5b5674e\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.098206 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7817603d-cfdf-425d-84de-095ff5b5674e" (UID: "7817603d-cfdf-425d-84de-095ff5b5674e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.101567 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4" (OuterVolumeSpecName: "kube-api-access-2m5q4") pod "7817603d-cfdf-425d-84de-095ff5b5674e" (UID: "7817603d-cfdf-425d-84de-095ff5b5674e"). InnerVolumeSpecName "kube-api-access-2m5q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.155264 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.161630 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198090 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ct92\" (UniqueName: \"kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92\") pod \"ddb8d79c-fba7-489b-8953-f507deb03a03\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198147 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqrk2\" (UniqueName: \"kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2\") pod \"67126a3c-ec10-4f12-96ad-0133fcabb75f\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198271 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts\") pod \"ddb8d79c-fba7-489b-8953-f507deb03a03\" (UID: \"ddb8d79c-fba7-489b-8953-f507deb03a03\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198334 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts\") pod \"67126a3c-ec10-4f12-96ad-0133fcabb75f\" (UID: \"67126a3c-ec10-4f12-96ad-0133fcabb75f\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198752 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m5q4\" (UniqueName: \"kubernetes.io/projected/7817603d-cfdf-425d-84de-095ff5b5674e-kube-api-access-2m5q4\") on node \"crc\" DevicePath 
\"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.198772 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7817603d-cfdf-425d-84de-095ff5b5674e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.200215 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67126a3c-ec10-4f12-96ad-0133fcabb75f" (UID: "67126a3c-ec10-4f12-96ad-0133fcabb75f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.201347 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddb8d79c-fba7-489b-8953-f507deb03a03" (UID: "ddb8d79c-fba7-489b-8953-f507deb03a03"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.202238 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.205150 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92" (OuterVolumeSpecName: "kube-api-access-7ct92") pod "ddb8d79c-fba7-489b-8953-f507deb03a03" (UID: "ddb8d79c-fba7-489b-8953-f507deb03a03"). InnerVolumeSpecName "kube-api-access-7ct92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.205608 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2" (OuterVolumeSpecName: "kube-api-access-vqrk2") pod "67126a3c-ec10-4f12-96ad-0133fcabb75f" (UID: "67126a3c-ec10-4f12-96ad-0133fcabb75f"). InnerVolumeSpecName "kube-api-access-vqrk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.226574 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.300321 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8bcs\" (UniqueName: \"kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs\") pod \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.300460 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts\") pod \"479c04e1-21fc-4674-98ac-3abc9ba96b34\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.300841 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "479c04e1-21fc-4674-98ac-3abc9ba96b34" (UID: "479c04e1-21fc-4674-98ac-3abc9ba96b34"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.300958 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts\") pod \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\" (UID: \"ece646f4-4f02-4a0d-9dde-ffb3d913410e\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.301318 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ece646f4-4f02-4a0d-9dde-ffb3d913410e" (UID: "ece646f4-4f02-4a0d-9dde-ffb3d913410e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.301766 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96c287a2-30d6-4055-aea8-d104dbd472c2" (UID: "96c287a2-30d6-4055-aea8-d104dbd472c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.301425 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts\") pod \"96c287a2-30d6-4055-aea8-d104dbd472c2\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.301833 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znz52\" (UniqueName: \"kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52\") pod \"479c04e1-21fc-4674-98ac-3abc9ba96b34\" (UID: \"479c04e1-21fc-4674-98ac-3abc9ba96b34\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302107 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2sw\" (UniqueName: \"kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw\") pod \"96c287a2-30d6-4055-aea8-d104dbd472c2\" (UID: \"96c287a2-30d6-4055-aea8-d104dbd472c2\") " Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302513 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67126a3c-ec10-4f12-96ad-0133fcabb75f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302536 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ece646f4-4f02-4a0d-9dde-ffb3d913410e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302549 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96c287a2-30d6-4055-aea8-d104dbd472c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc 
kubenswrapper[4699]: I1122 04:26:20.302561 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ct92\" (UniqueName: \"kubernetes.io/projected/ddb8d79c-fba7-489b-8953-f507deb03a03-kube-api-access-7ct92\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302591 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqrk2\" (UniqueName: \"kubernetes.io/projected/67126a3c-ec10-4f12-96ad-0133fcabb75f-kube-api-access-vqrk2\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302601 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/479c04e1-21fc-4674-98ac-3abc9ba96b34-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.302613 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb8d79c-fba7-489b-8953-f507deb03a03-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.330759 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw" (OuterVolumeSpecName: "kube-api-access-7h2sw") pod "96c287a2-30d6-4055-aea8-d104dbd472c2" (UID: "96c287a2-30d6-4055-aea8-d104dbd472c2"). InnerVolumeSpecName "kube-api-access-7h2sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.330828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs" (OuterVolumeSpecName: "kube-api-access-m8bcs") pod "ece646f4-4f02-4a0d-9dde-ffb3d913410e" (UID: "ece646f4-4f02-4a0d-9dde-ffb3d913410e"). InnerVolumeSpecName "kube-api-access-m8bcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.331876 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52" (OuterVolumeSpecName: "kube-api-access-znz52") pod "479c04e1-21fc-4674-98ac-3abc9ba96b34" (UID: "479c04e1-21fc-4674-98ac-3abc9ba96b34"). InnerVolumeSpecName "kube-api-access-znz52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.404422 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znz52\" (UniqueName: \"kubernetes.io/projected/479c04e1-21fc-4674-98ac-3abc9ba96b34-kube-api-access-znz52\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.404488 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h2sw\" (UniqueName: \"kubernetes.io/projected/96c287a2-30d6-4055-aea8-d104dbd472c2-kube-api-access-7h2sw\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.404502 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8bcs\" (UniqueName: \"kubernetes.io/projected/ece646f4-4f02-4a0d-9dde-ffb3d913410e-kube-api-access-m8bcs\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.532605 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.591930 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:26:20 crc kubenswrapper[4699]: I1122 04:26:20.592216 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-bdb7g" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="dnsmasq-dns" 
containerID="cri-o://22cbc53f0e4539ab600ee81cc3fc691c24f4f65d650e4b1a822c0d12ea4098a1" gracePeriod=10 Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.062017 4699 generic.go:334] "Generic (PLEG): container finished" podID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerID="22cbc53f0e4539ab600ee81cc3fc691c24f4f65d650e4b1a822c0d12ea4098a1" exitCode=0 Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.062876 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerDied","Data":"22cbc53f0e4539ab600ee81cc3fc691c24f4f65d650e4b1a822c0d12ea4098a1"} Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.062904 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bdb7g" event={"ID":"5150cdc8-40db-4421-bbcd-16213ce14b2e","Type":"ContainerDied","Data":"cc40699197245008474c492eab670ed23d89b058441b1123e7f86b93c94138d6"} Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.062915 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc40699197245008474c492eab670ed23d89b058441b1123e7f86b93c94138d6" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066670 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-919f-account-create-zkhx4" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066707 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6kgwt" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066679 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xj7gh" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066728 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8xlt7" event={"ID":"34007e0a-511e-41f1-b3fc-810d7911d11c","Type":"ContainerStarted","Data":"805f75393a4c3c6f8b4816ed5752ac5eff8ef1a971274d36c33b43ff56076633"} Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066680 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ac7-account-create-jmn9n" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.066847 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c223-account-create-gfsvr" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.114055 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8xlt7" podStartSLOduration=2.27141384 podStartE2EDuration="7.11401666s" podCreationTimestamp="2025-11-22 04:26:14 +0000 UTC" firstStartedPulling="2025-11-22 04:26:15.127674088 +0000 UTC m=+1126.470295275" lastFinishedPulling="2025-11-22 04:26:19.970276888 +0000 UTC m=+1131.312898095" observedRunningTime="2025-11-22 04:26:21.111595461 +0000 UTC m=+1132.454216678" watchObservedRunningTime="2025-11-22 04:26:21.11401666 +0000 UTC m=+1132.456637847" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.331697 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.422981 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb\") pod \"5150cdc8-40db-4421-bbcd-16213ce14b2e\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.423045 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgmz8\" (UniqueName: \"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8\") pod \"5150cdc8-40db-4421-bbcd-16213ce14b2e\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.423094 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc\") pod \"5150cdc8-40db-4421-bbcd-16213ce14b2e\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.423173 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb\") pod \"5150cdc8-40db-4421-bbcd-16213ce14b2e\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.423196 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config\") pod \"5150cdc8-40db-4421-bbcd-16213ce14b2e\" (UID: \"5150cdc8-40db-4421-bbcd-16213ce14b2e\") " Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.430051 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8" (OuterVolumeSpecName: "kube-api-access-vgmz8") pod "5150cdc8-40db-4421-bbcd-16213ce14b2e" (UID: "5150cdc8-40db-4421-bbcd-16213ce14b2e"). InnerVolumeSpecName "kube-api-access-vgmz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.469643 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5150cdc8-40db-4421-bbcd-16213ce14b2e" (UID: "5150cdc8-40db-4421-bbcd-16213ce14b2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.471500 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5150cdc8-40db-4421-bbcd-16213ce14b2e" (UID: "5150cdc8-40db-4421-bbcd-16213ce14b2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.473609 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config" (OuterVolumeSpecName: "config") pod "5150cdc8-40db-4421-bbcd-16213ce14b2e" (UID: "5150cdc8-40db-4421-bbcd-16213ce14b2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.494167 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5150cdc8-40db-4421-bbcd-16213ce14b2e" (UID: "5150cdc8-40db-4421-bbcd-16213ce14b2e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.525463 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.525512 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgmz8\" (UniqueName: \"kubernetes.io/projected/5150cdc8-40db-4421-bbcd-16213ce14b2e-kube-api-access-vgmz8\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.525523 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.525532 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:21 crc kubenswrapper[4699]: I1122 04:26:21.525541 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5150cdc8-40db-4421-bbcd-16213ce14b2e-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:22 crc kubenswrapper[4699]: I1122 04:26:22.073285 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bdb7g" Nov 22 04:26:22 crc kubenswrapper[4699]: I1122 04:26:22.107143 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:26:22 crc kubenswrapper[4699]: I1122 04:26:22.114395 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bdb7g"] Nov 22 04:26:23 crc kubenswrapper[4699]: I1122 04:26:23.459291 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" path="/var/lib/kubelet/pods/5150cdc8-40db-4421-bbcd-16213ce14b2e/volumes" Nov 22 04:26:31 crc kubenswrapper[4699]: I1122 04:26:31.150475 4699 generic.go:334] "Generic (PLEG): container finished" podID="34007e0a-511e-41f1-b3fc-810d7911d11c" containerID="805f75393a4c3c6f8b4816ed5752ac5eff8ef1a971274d36c33b43ff56076633" exitCode=0 Nov 22 04:26:31 crc kubenswrapper[4699]: I1122 04:26:31.150530 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8xlt7" event={"ID":"34007e0a-511e-41f1-b3fc-810d7911d11c","Type":"ContainerDied","Data":"805f75393a4c3c6f8b4816ed5752ac5eff8ef1a971274d36c33b43ff56076633"} Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.464771 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.499609 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm45j\" (UniqueName: \"kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j\") pod \"34007e0a-511e-41f1-b3fc-810d7911d11c\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.499741 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle\") pod \"34007e0a-511e-41f1-b3fc-810d7911d11c\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.499832 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data\") pod \"34007e0a-511e-41f1-b3fc-810d7911d11c\" (UID: \"34007e0a-511e-41f1-b3fc-810d7911d11c\") " Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.507760 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j" (OuterVolumeSpecName: "kube-api-access-fm45j") pod "34007e0a-511e-41f1-b3fc-810d7911d11c" (UID: "34007e0a-511e-41f1-b3fc-810d7911d11c"). InnerVolumeSpecName "kube-api-access-fm45j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.523137 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34007e0a-511e-41f1-b3fc-810d7911d11c" (UID: "34007e0a-511e-41f1-b3fc-810d7911d11c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.546070 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data" (OuterVolumeSpecName: "config-data") pod "34007e0a-511e-41f1-b3fc-810d7911d11c" (UID: "34007e0a-511e-41f1-b3fc-810d7911d11c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.601258 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.601297 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34007e0a-511e-41f1-b3fc-810d7911d11c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:32 crc kubenswrapper[4699]: I1122 04:26:32.601306 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm45j\" (UniqueName: \"kubernetes.io/projected/34007e0a-511e-41f1-b3fc-810d7911d11c-kube-api-access-fm45j\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.168712 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8xlt7" event={"ID":"34007e0a-511e-41f1-b3fc-810d7911d11c","Type":"ContainerDied","Data":"ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc"} Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.168752 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3a5badede071d3425c341cd60bd78f5f81ef9c33bdafda809e2dc481d33efc" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.168818 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8xlt7" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.424820 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425486 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7817603d-cfdf-425d-84de-095ff5b5674e" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425500 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="7817603d-cfdf-425d-84de-095ff5b5674e" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425517 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="dnsmasq-dns" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425525 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="dnsmasq-dns" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425535 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece646f4-4f02-4a0d-9dde-ffb3d913410e" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425541 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece646f4-4f02-4a0d-9dde-ffb3d913410e" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425551 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67126a3c-ec10-4f12-96ad-0133fcabb75f" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425557 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="67126a3c-ec10-4f12-96ad-0133fcabb75f" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425566 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ddb8d79c-fba7-489b-8953-f507deb03a03" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425572 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb8d79c-fba7-489b-8953-f507deb03a03" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425585 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34007e0a-511e-41f1-b3fc-810d7911d11c" containerName="keystone-db-sync" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425601 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="34007e0a-511e-41f1-b3fc-810d7911d11c" containerName="keystone-db-sync" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425610 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c287a2-30d6-4055-aea8-d104dbd472c2" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425616 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c287a2-30d6-4055-aea8-d104dbd472c2" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425629 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="init" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425636 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="init" Nov 22 04:26:33 crc kubenswrapper[4699]: E1122 04:26:33.425646 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479c04e1-21fc-4674-98ac-3abc9ba96b34" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425652 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="479c04e1-21fc-4674-98ac-3abc9ba96b34" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425817 4699 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="34007e0a-511e-41f1-b3fc-810d7911d11c" containerName="keystone-db-sync" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425830 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece646f4-4f02-4a0d-9dde-ffb3d913410e" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425838 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb8d79c-fba7-489b-8953-f507deb03a03" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425845 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="7817603d-cfdf-425d-84de-095ff5b5674e" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425857 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c287a2-30d6-4055-aea8-d104dbd472c2" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425867 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5150cdc8-40db-4421-bbcd-16213ce14b2e" containerName="dnsmasq-dns" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425879 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="479c04e1-21fc-4674-98ac-3abc9ba96b34" containerName="mariadb-account-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.425898 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="67126a3c-ec10-4f12-96ad-0133fcabb75f" containerName="mariadb-database-create" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.426771 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.442417 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.477282 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-snnx9"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.478302 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.494397 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qfldf" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.494511 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.494571 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.494727 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.494735 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517326 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czcxb\" (UniqueName: 
\"kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517542 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517583 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517711 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517829 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517892 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.517985 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.518010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.518040 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.518103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.518187 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpctd\" (UniqueName: 
\"kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.531419 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-snnx9"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619351 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czcxb\" (UniqueName: \"kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619412 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619458 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619499 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619548 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619577 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619621 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619644 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619669 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.619706 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.620148 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpctd\" (UniqueName: \"kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.620229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.621263 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.621852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.621854 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.622582 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.622695 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.630711 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.636603 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.639796 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-plg9d"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.641079 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.646978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.647484 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.649191 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.662670 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czcxb\" (UniqueName: \"kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb\") pod \"keystone-bootstrap-snnx9\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.670503 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-plg9d"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.683040 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpctd\" (UniqueName: \"kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd\") pod 
\"dnsmasq-dns-847c4cc679-2zpw6\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.687330 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-afdd-account-create-6x2nj"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.695992 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.710934 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.722175 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts\") pod \"ironic-db-create-plg9d\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.722553 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2n4\" (UniqueName: \"kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4\") pod \"ironic-db-create-plg9d\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.750484 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.751677 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-afdd-account-create-6x2nj"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.827142 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.829727 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts\") pod \"ironic-db-create-plg9d\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.829810 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24z8\" (UniqueName: \"kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.829976 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2n4\" (UniqueName: \"kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4\") pod \"ironic-db-create-plg9d\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.830093 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.831418 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts\") pod \"ironic-db-create-plg9d\" (UID: 
\"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.870624 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2n4\" (UniqueName: \"kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4\") pod \"ironic-db-create-plg9d\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.870852 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fgx2c"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.879962 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.881769 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.895531 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgx2c"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.895589 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.897528 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.905063 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s2lsl" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.905291 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.905413 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.905636 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.905834 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.907258 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.907292 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mf25h"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.908169 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.921034 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.921058 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gn46q" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.946369 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948118 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x24z8\" (UniqueName: \"kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948219 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948260 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948280 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbvs\" (UniqueName: 
\"kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948309 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948327 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948389 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948424 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.948469 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955558 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljtl\" (UniqueName: \"kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955585 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955607 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: 
\"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955683 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955701 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955781 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.955796 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzdkb\" (UniqueName: \"kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.957597 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " 
pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:33 crc kubenswrapper[4699]: I1122 04:26:33.964589 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mf25h"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.008190 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24z8\" (UniqueName: \"kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8\") pod \"ironic-afdd-account-create-6x2nj\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.009581 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.011807 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.058709 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.058752 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qljtl\" (UniqueName: \"kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.058774 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts\") pod \"ceilometer-0\" (UID: 
\"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.058814 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.058845 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.059971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060033 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060122 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5cft\" (UniqueName: \"kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060154 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060171 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzdkb\" (UniqueName: \"kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060246 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060481 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.060558 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062058 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062092 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbvs\" (UniqueName: \"kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062135 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062154 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062171 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062239 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.062291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.063921 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.065545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " 
pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.065678 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.067230 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.071261 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.071590 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.082750 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljtl\" (UniqueName: \"kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.084693 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzdkb\" (UniqueName: \"kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb\") pod \"cinder-db-sync-fgx2c\" (UID: 
\"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.085680 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.086238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.086583 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle\") pod \"barbican-db-sync-mf25h\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.087129 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.087693 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.088150 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.092199 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.093207 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data\") pod \"cinder-db-sync-fgx2c\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.103377 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbvs\" (UniqueName: \"kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs\") pod \"ceilometer-0\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.140188 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dhclj"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.141307 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.162417 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-98xfd" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.162621 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.162990 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.176932 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.177084 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.177163 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.177459 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc\") pod 
\"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.177492 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5cft\" (UniqueName: \"kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.177544 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.179793 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.179686 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dhclj"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.180172 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.181162 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.182108 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.183255 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.190971 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zb5vb"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.192365 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.194390 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nbqm8" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.194422 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.194424 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.199740 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5cft\" (UniqueName: \"kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft\") pod \"dnsmasq-dns-785d8bcb8c-6dplx\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.232734 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.245052 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zb5vb"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.277877 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279380 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279412 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279508 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279561 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 
04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279603 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29zst\" (UniqueName: \"kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279627 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.279685 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjbn\" (UniqueName: \"kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.326876 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.372897 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mf25h" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381706 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjbn\" (UniqueName: \"kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381775 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381808 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381831 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381910 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.381962 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.382003 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29zst\" (UniqueName: \"kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.382027 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.382508 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.389593 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.397646 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.404891 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.410149 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29zst\" (UniqueName: \"kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.410381 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.411894 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle\") pod \"placement-db-sync-zb5vb\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.412765 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.414930 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjbn\" (UniqueName: \"kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn\") pod \"neutron-db-sync-dhclj\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.472323 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.484884 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dhclj" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.526632 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zb5vb" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.616369 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-snnx9"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.628711 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.630092 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.637555 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-plg9d"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.641404 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.641841 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.641997 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.642124 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4kqxm" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.647715 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.690782 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.690856 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.690957 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.691008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx665\" (UniqueName: \"kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.691111 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.692039 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.692104 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 
04:26:34.692189 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.743214 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.744737 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.746762 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.747090 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.759127 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.766996 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-afdd-account-create-6x2nj"] Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.794916 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.794977 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795033 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795103 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795145 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795180 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795270 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" 
(UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.795340 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx665\" (UniqueName: \"kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.799679 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.802644 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.803752 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.820456 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " 
pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.821225 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx665\" (UniqueName: \"kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.831916 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.834275 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.839217 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.897329 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " pod="openstack/glance-default-external-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 
04:26:34.897383 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.897832 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.897899 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.897981 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.898008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 
04:26:34.898035 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fdjt\" (UniqueName: \"kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.898190 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:34 crc kubenswrapper[4699]: I1122 04:26:34.898314 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:34.948598 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:34.957972 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgx2c"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:34.960179 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003489 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003545 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003580 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fdjt\" (UniqueName: \"kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003657 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003727 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" 
Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003768 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003800 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.003835 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.008779 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.009367 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.011168 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.014406 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.014703 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.015795 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.030801 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.037832 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fdjt\" (UniqueName: 
\"kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.120832 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mf25h"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.127889 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: W1122 04:26:35.144783 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79a788b_1b1c_45df_9c90_3c30d382691b.slice/crio-f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336 WatchSource:0}: Error finding container f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336: Status 404 returned error can't find the container with id f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336 Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.194648 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mf25h" event={"ID":"a79a788b-1b1c-45df-9c90-3c30d382691b","Type":"ContainerStarted","Data":"f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.196747 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgx2c" event={"ID":"a2442edb-5370-4fd9-af87-6cb17498cee6","Type":"ContainerStarted","Data":"70f64905ce8944a01a329b0de4744e0daff6439506d70264c41dd17374b711da"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.197904 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ironic-afdd-account-create-6x2nj" event={"ID":"4040846b-26da-4123-b778-0115b8c5e6da","Type":"ContainerStarted","Data":"9a3babea702eb4cc36c855edeb557214693c7504cbde5e136dc60ffafd210b03"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.199094 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-plg9d" event={"ID":"0cb04579-e9a4-4139-8b94-4ce96f466397","Type":"ContainerStarted","Data":"c69492916d49ad2b40292c8b35da50dffc2bddad0b3603c4911975604a3fdaed"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.200148 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" event={"ID":"4cfccecc-0040-47e6-ad18-c905d4a5e097","Type":"ContainerStarted","Data":"9d4407a8469e89e823bd4cdd4f134a8ec95af08185b1e96f3641f3f1d5f3dad5"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.200985 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-snnx9" event={"ID":"f186fd49-86df-4a93-8c3d-d068660fe92b","Type":"ContainerStarted","Data":"89784d4e276d26c1a74df1723882501f8ace5fc33ee23ed8884a5570c795d4bb"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.202275 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerStarted","Data":"a0e358ebdc8c954c0d6b11d9df593c8e636eee1df6225d1c57661f184da6d392"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.310641 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.750295 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:35.847147 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.212397 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-plg9d" event={"ID":"0cb04579-e9a4-4139-8b94-4ce96f466397","Type":"ContainerStarted","Data":"5af0e9b3d6eb36d09b86a7f1b03bd2813edaffcae4977d0f097f41890f77b98d"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.214835 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" event={"ID":"4cfccecc-0040-47e6-ad18-c905d4a5e097","Type":"ContainerStarted","Data":"d6ea9d4338bb507602d62ed199b9c14ba0a5732da4e62e183add16afb85feb2a"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.217081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-snnx9" event={"ID":"f186fd49-86df-4a93-8c3d-d068660fe92b","Type":"ContainerStarted","Data":"1181a5b24efb2053c968a4aee3c0605356cb691d1a864b4dbc90d78b5c41283e"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.218274 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-afdd-account-create-6x2nj" event={"ID":"4040846b-26da-4123-b778-0115b8c5e6da","Type":"ContainerStarted","Data":"7857f769493c994617491a9656ac2379522e640053b0b678286edddd2d776b2e"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.238284 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-create-plg9d" podStartSLOduration=3.238268065 podStartE2EDuration="3.238268065s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:36.223989299 +0000 UTC m=+1147.566610486" watchObservedRunningTime="2025-11-22 04:26:36.238268065 +0000 UTC m=+1147.580889252" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.249728 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-afdd-account-create-6x2nj" podStartSLOduration=3.249704271 podStartE2EDuration="3.249704271s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:36.237860905 +0000 UTC m=+1147.580482092" watchObservedRunningTime="2025-11-22 04:26:36.249704271 +0000 UTC m=+1147.592325458" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:36.264502 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-snnx9" podStartSLOduration=3.264481389 podStartE2EDuration="3.264481389s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:36.259728914 +0000 UTC m=+1147.602350101" watchObservedRunningTime="2025-11-22 04:26:36.264481389 +0000 UTC m=+1147.607102576" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:37.241224 4699 generic.go:334] "Generic (PLEG): container finished" podID="4cfccecc-0040-47e6-ad18-c905d4a5e097" containerID="d6ea9d4338bb507602d62ed199b9c14ba0a5732da4e62e183add16afb85feb2a" exitCode=0 Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:37.241330 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" event={"ID":"4cfccecc-0040-47e6-ad18-c905d4a5e097","Type":"ContainerDied","Data":"d6ea9d4338bb507602d62ed199b9c14ba0a5732da4e62e183add16afb85feb2a"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.252940 4699 
generic.go:334] "Generic (PLEG): container finished" podID="4040846b-26da-4123-b778-0115b8c5e6da" containerID="7857f769493c994617491a9656ac2379522e640053b0b678286edddd2d776b2e" exitCode=0 Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.252981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-afdd-account-create-6x2nj" event={"ID":"4040846b-26da-4123-b778-0115b8c5e6da","Type":"ContainerDied","Data":"7857f769493c994617491a9656ac2379522e640053b0b678286edddd2d776b2e"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.255166 4699 generic.go:334] "Generic (PLEG): container finished" podID="0cb04579-e9a4-4139-8b94-4ce96f466397" containerID="5af0e9b3d6eb36d09b86a7f1b03bd2813edaffcae4977d0f097f41890f77b98d" exitCode=0 Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.255211 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-plg9d" event={"ID":"0cb04579-e9a4-4139-8b94-4ce96f466397","Type":"ContainerDied","Data":"5af0e9b3d6eb36d09b86a7f1b03bd2813edaffcae4977d0f097f41890f77b98d"} Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.554272 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.561788 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dhclj"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.605343 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:26:38 crc kubenswrapper[4699]: W1122 04:26:38.625666 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e2eecb1_f103_46a7_9f37_5d2259df0703.slice/crio-68b223ea71390f4c1e9075f18ffc88a9f106ce78361a1cf478621e5c25407b7e WatchSource:0}: Error finding container 68b223ea71390f4c1e9075f18ffc88a9f106ce78361a1cf478621e5c25407b7e: Status 404 returned error can't find the 
container with id 68b223ea71390f4c1e9075f18ffc88a9f106ce78361a1cf478621e5c25407b7e Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.742329 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zb5vb"] Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.809521 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890066 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890118 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890170 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpctd\" (UniqueName: \"kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890267 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890304 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.890366 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.916734 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd" (OuterVolumeSpecName: "kube-api-access-vpctd") pod "4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "kube-api-access-vpctd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.931278 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:26:38 crc kubenswrapper[4699]: W1122 04:26:38.953341 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06d61fb1_732e_4e73_a859_b94c63838a8a.slice/crio-e0f87c3418e69470ea3eeebe9da556893834dd0b165d44566c95b126f3d51887 WatchSource:0}: Error finding container e0f87c3418e69470ea3eeebe9da556893834dd0b165d44566c95b126f3d51887: Status 404 returned error can't find the container with id e0f87c3418e69470ea3eeebe9da556893834dd0b165d44566c95b126f3d51887 Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.954396 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod 
"4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.985134 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config" (OuterVolumeSpecName: "config") pod "4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.987013 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.991797 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.992207 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") pod \"4cfccecc-0040-47e6-ad18-c905d4a5e097\" (UID: \"4cfccecc-0040-47e6-ad18-c905d4a5e097\") " Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.992933 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.992957 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpctd\" (UniqueName: \"kubernetes.io/projected/4cfccecc-0040-47e6-ad18-c905d4a5e097-kube-api-access-vpctd\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.992973 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.992985 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:38 crc kubenswrapper[4699]: W1122 04:26:38.993276 4699 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4cfccecc-0040-47e6-ad18-c905d4a5e097/volumes/kubernetes.io~configmap/ovsdbserver-sb Nov 22 04:26:38 crc kubenswrapper[4699]: I1122 04:26:38.993297 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod 
"4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.009682 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4cfccecc-0040-47e6-ad18-c905d4a5e097" (UID: "4cfccecc-0040-47e6-ad18-c905d4a5e097"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.095255 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.095291 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4cfccecc-0040-47e6-ad18-c905d4a5e097-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.267741 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerID="05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a" exitCode=0 Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.267864 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" event={"ID":"6e2eecb1-f103-46a7-9f37-5d2259df0703","Type":"ContainerDied","Data":"05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.267901 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" 
event={"ID":"6e2eecb1-f103-46a7-9f37-5d2259df0703","Type":"ContainerStarted","Data":"68b223ea71390f4c1e9075f18ffc88a9f106ce78361a1cf478621e5c25407b7e"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.273521 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" event={"ID":"4cfccecc-0040-47e6-ad18-c905d4a5e097","Type":"ContainerDied","Data":"9d4407a8469e89e823bd4cdd4f134a8ec95af08185b1e96f3641f3f1d5f3dad5"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.273594 4699 scope.go:117] "RemoveContainer" containerID="d6ea9d4338bb507602d62ed199b9c14ba0a5732da4e62e183add16afb85feb2a" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.273709 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-2zpw6" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.289254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zb5vb" event={"ID":"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b","Type":"ContainerStarted","Data":"476ef44dd4a9f643bb9383a43667c19f3733e804c8f1eb13e3696f0193414f23"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.292139 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerStarted","Data":"e0f87c3418e69470ea3eeebe9da556893834dd0b165d44566c95b126f3d51887"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.296822 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dhclj" event={"ID":"5c7883b3-956a-412b-87b7-f7366042440b","Type":"ContainerStarted","Data":"f89adf7b40fa021a0f196d11b77ffea1655dd93826c756566309dfba9ac95472"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.296901 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dhclj" 
event={"ID":"5c7883b3-956a-412b-87b7-f7366042440b","Type":"ContainerStarted","Data":"c4cbbbf8bef678e23067891780026b13f45888a86a8cfab0aa6b9b8c98866377"} Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.315811 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dhclj" podStartSLOduration=6.315759225 podStartE2EDuration="6.315759225s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:39.314776351 +0000 UTC m=+1150.657397538" watchObservedRunningTime="2025-11-22 04:26:39.315759225 +0000 UTC m=+1150.658380412" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.380546 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.390649 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-2zpw6"] Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.463537 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfccecc-0040-47e6-ad18-c905d4a5e097" path="/var/lib/kubelet/pods/4cfccecc-0040-47e6-ad18-c905d4a5e097/volumes" Nov 22 04:26:39 crc kubenswrapper[4699]: I1122 04:26:39.539599 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:26:40 crc kubenswrapper[4699]: I1122 04:26:40.313547 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerStarted","Data":"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1"} Nov 22 04:26:42 crc kubenswrapper[4699]: I1122 04:26:42.333139 4699 generic.go:334] "Generic (PLEG): container finished" podID="f186fd49-86df-4a93-8c3d-d068660fe92b" 
containerID="1181a5b24efb2053c968a4aee3c0605356cb691d1a864b4dbc90d78b5c41283e" exitCode=0 Nov 22 04:26:42 crc kubenswrapper[4699]: I1122 04:26:42.333243 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-snnx9" event={"ID":"f186fd49-86df-4a93-8c3d-d068660fe92b","Type":"ContainerDied","Data":"1181a5b24efb2053c968a4aee3c0605356cb691d1a864b4dbc90d78b5c41283e"} Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.071288 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.077271 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.208611 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts\") pod \"4040846b-26da-4123-b778-0115b8c5e6da\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.208981 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts\") pod \"0cb04579-e9a4-4139-8b94-4ce96f466397\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.209027 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km2n4\" (UniqueName: \"kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4\") pod \"0cb04579-e9a4-4139-8b94-4ce96f466397\" (UID: \"0cb04579-e9a4-4139-8b94-4ce96f466397\") " Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.209386 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-x24z8\" (UniqueName: \"kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8\") pod \"4040846b-26da-4123-b778-0115b8c5e6da\" (UID: \"4040846b-26da-4123-b778-0115b8c5e6da\") " Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.209493 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4040846b-26da-4123-b778-0115b8c5e6da" (UID: "4040846b-26da-4123-b778-0115b8c5e6da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.209746 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4040846b-26da-4123-b778-0115b8c5e6da-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.209856 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cb04579-e9a4-4139-8b94-4ce96f466397" (UID: "0cb04579-e9a4-4139-8b94-4ce96f466397"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.216646 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8" (OuterVolumeSpecName: "kube-api-access-x24z8") pod "4040846b-26da-4123-b778-0115b8c5e6da" (UID: "4040846b-26da-4123-b778-0115b8c5e6da"). InnerVolumeSpecName "kube-api-access-x24z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.216772 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4" (OuterVolumeSpecName: "kube-api-access-km2n4") pod "0cb04579-e9a4-4139-8b94-4ce96f466397" (UID: "0cb04579-e9a4-4139-8b94-4ce96f466397"). InnerVolumeSpecName "kube-api-access-km2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.311855 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x24z8\" (UniqueName: \"kubernetes.io/projected/4040846b-26da-4123-b778-0115b8c5e6da-kube-api-access-x24z8\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.311901 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cb04579-e9a4-4139-8b94-4ce96f466397-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.311914 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km2n4\" (UniqueName: \"kubernetes.io/projected/0cb04579-e9a4-4139-8b94-4ce96f466397-kube-api-access-km2n4\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.345136 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-afdd-account-create-6x2nj" event={"ID":"4040846b-26da-4123-b778-0115b8c5e6da","Type":"ContainerDied","Data":"9a3babea702eb4cc36c855edeb557214693c7504cbde5e136dc60ffafd210b03"} Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.345182 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3babea702eb4cc36c855edeb557214693c7504cbde5e136dc60ffafd210b03" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.345240 4699 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/ironic-afdd-account-create-6x2nj" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.351853 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-plg9d" event={"ID":"0cb04579-e9a4-4139-8b94-4ce96f466397","Type":"ContainerDied","Data":"c69492916d49ad2b40292c8b35da50dffc2bddad0b3603c4911975604a3fdaed"} Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.351890 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-plg9d" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.351895 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c69492916d49ad2b40292c8b35da50dffc2bddad0b3603c4911975604a3fdaed" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.363426 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" event={"ID":"6e2eecb1-f103-46a7-9f37-5d2259df0703","Type":"ContainerStarted","Data":"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee"} Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.363523 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:43 crc kubenswrapper[4699]: I1122 04:26:43.386837 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" podStartSLOduration=10.386814105 podStartE2EDuration="10.386814105s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:26:43.380120243 +0000 UTC m=+1154.722741440" watchObservedRunningTime="2025-11-22 04:26:43.386814105 +0000 UTC m=+1154.729435292" Nov 22 04:26:45 crc kubenswrapper[4699]: W1122 04:26:45.899712 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00e712f4_ba35_46e1_8ff2_3e1eb7615e69.slice/crio-9f331d1ae4c1f9bd39ad77ec1b032f47c0488d65f9131d02d701da0a68172542 WatchSource:0}: Error finding container 9f331d1ae4c1f9bd39ad77ec1b032f47c0488d65f9131d02d701da0a68172542: Status 404 returned error can't find the container with id 9f331d1ae4c1f9bd39ad77ec1b032f47c0488d65f9131d02d701da0a68172542 Nov 22 04:26:46 crc kubenswrapper[4699]: I1122 04:26:46.386236 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerStarted","Data":"9f331d1ae4c1f9bd39ad77ec1b032f47c0488d65f9131d02d701da0a68172542"} Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.126782 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.294899 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.294965 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.295013 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czcxb\" (UniqueName: \"kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc 
kubenswrapper[4699]: I1122 04:26:48.295059 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.295109 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.295188 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data\") pod \"f186fd49-86df-4a93-8c3d-d068660fe92b\" (UID: \"f186fd49-86df-4a93-8c3d-d068660fe92b\") " Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.302282 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb" (OuterVolumeSpecName: "kube-api-access-czcxb") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "kube-api-access-czcxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.302815 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.303956 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts" (OuterVolumeSpecName: "scripts") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.310777 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.327723 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.328629 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data" (OuterVolumeSpecName: "config-data") pod "f186fd49-86df-4a93-8c3d-d068660fe92b" (UID: "f186fd49-86df-4a93-8c3d-d068660fe92b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397193 4699 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397550 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397560 4699 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397572 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397582 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czcxb\" (UniqueName: \"kubernetes.io/projected/f186fd49-86df-4a93-8c3d-d068660fe92b-kube-api-access-czcxb\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.397591 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f186fd49-86df-4a93-8c3d-d068660fe92b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.423153 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-snnx9" event={"ID":"f186fd49-86df-4a93-8c3d-d068660fe92b","Type":"ContainerDied","Data":"89784d4e276d26c1a74df1723882501f8ace5fc33ee23ed8884a5570c795d4bb"} Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 
04:26:48.423220 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89784d4e276d26c1a74df1723882501f8ace5fc33ee23ed8884a5570c795d4bb" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.423189 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-snnx9" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944223 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-bd6j2"] Nov 22 04:26:48 crc kubenswrapper[4699]: E1122 04:26:48.944605 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb04579-e9a4-4139-8b94-4ce96f466397" containerName="mariadb-database-create" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944621 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb04579-e9a4-4139-8b94-4ce96f466397" containerName="mariadb-database-create" Nov 22 04:26:48 crc kubenswrapper[4699]: E1122 04:26:48.944629 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfccecc-0040-47e6-ad18-c905d4a5e097" containerName="init" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944636 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfccecc-0040-47e6-ad18-c905d4a5e097" containerName="init" Nov 22 04:26:48 crc kubenswrapper[4699]: E1122 04:26:48.944651 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f186fd49-86df-4a93-8c3d-d068660fe92b" containerName="keystone-bootstrap" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944658 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f186fd49-86df-4a93-8c3d-d068660fe92b" containerName="keystone-bootstrap" Nov 22 04:26:48 crc kubenswrapper[4699]: E1122 04:26:48.944695 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4040846b-26da-4123-b778-0115b8c5e6da" containerName="mariadb-account-create" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944703 4699 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4040846b-26da-4123-b778-0115b8c5e6da" containerName="mariadb-account-create" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944846 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfccecc-0040-47e6-ad18-c905d4a5e097" containerName="init" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944859 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb04579-e9a4-4139-8b94-4ce96f466397" containerName="mariadb-database-create" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944872 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f186fd49-86df-4a93-8c3d-d068660fe92b" containerName="keystone-bootstrap" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.944884 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4040846b-26da-4123-b778-0115b8c5e6da" containerName="mariadb-account-create" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.945763 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.948119 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.948379 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.948654 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.948920 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-dockercfg-vh8nv" Nov 22 04:26:48 crc kubenswrapper[4699]: I1122 04:26:48.956355 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-bd6j2"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.108707 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.108819 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72v9\" (UniqueName: \"kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.108856 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " 
pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.108885 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.109089 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.109294 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.208797 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-snnx9"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211235 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g72v9\" (UniqueName: \"kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211285 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211318 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211404 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211477 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.211517 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.215278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged\") pod \"ironic-db-sync-bd6j2\" (UID: 
\"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.215968 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.216414 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.217047 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.217855 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo\") pod \"ironic-db-sync-bd6j2\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.217982 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-snnx9"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.228996 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72v9\" (UniqueName: \"kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9\") pod \"ironic-db-sync-bd6j2\" (UID: 
\"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.272903 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.305363 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5bqjq"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.306477 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.315772 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.315908 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.315942 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qfldf" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.317665 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.321022 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5bqjq"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.391746 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.419837 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" 
Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.419884 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.419921 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8mc\" (UniqueName: \"kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.419954 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.419989 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.420039 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc 
kubenswrapper[4699]: I1122 04:26:49.481970 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f186fd49-86df-4a93-8c3d-d068660fe92b" path="/var/lib/kubelet/pods/f186fd49-86df-4a93-8c3d-d068660fe92b/volumes" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.482939 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"] Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.483259 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" containerID="cri-o://a1a5b0166aeaff548a75355e9677f3747a6bae4fbd895a37d195f1a79b30ca91" gracePeriod=10 Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521543 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521607 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521773 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521889 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521914 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.521950 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8mc\" (UniqueName: \"kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.527717 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.528550 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.528633 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.538011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.539359 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8mc\" (UniqueName: \"kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.540253 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle\") pod \"keystone-bootstrap-5bqjq\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:49 crc kubenswrapper[4699]: I1122 04:26:49.646609 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:26:50 crc kubenswrapper[4699]: I1122 04:26:50.532226 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:26:55 crc kubenswrapper[4699]: I1122 04:26:55.531363 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:26:56 crc kubenswrapper[4699]: I1122 04:26:56.490522 4699 generic.go:334] "Generic (PLEG): container finished" podID="417b0282-cef1-4a7c-aca5-593297254fe3" containerID="a1a5b0166aeaff548a75355e9677f3747a6bae4fbd895a37d195f1a79b30ca91" exitCode=0 Nov 22 04:26:56 crc kubenswrapper[4699]: I1122 04:26:56.490574 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" event={"ID":"417b0282-cef1-4a7c-aca5-593297254fe3","Type":"ContainerDied","Data":"a1a5b0166aeaff548a75355e9677f3747a6bae4fbd895a37d195f1a79b30ca91"} Nov 22 04:27:00 crc kubenswrapper[4699]: I1122 04:27:00.532155 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:27:00 crc kubenswrapper[4699]: I1122 04:27:00.532795 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" Nov 22 04:27:05 crc kubenswrapper[4699]: I1122 04:27:05.531981 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" 
containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:27:11 crc kubenswrapper[4699]: I1122 04:27:10.531423 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:27:15 crc kubenswrapper[4699]: I1122 04:27:15.531947 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: connect: connection refused" Nov 22 04:27:25 crc kubenswrapper[4699]: I1122 04:27:25.532471 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Nov 22 04:27:28 crc kubenswrapper[4699]: E1122 04:27:28.656928 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 22 04:27:28 crc kubenswrapper[4699]: E1122 04:27:28.657635 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qljtl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mf25h_openstack(a79a788b-1b1c-45df-9c90-3c30d382691b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:27:28 crc kubenswrapper[4699]: E1122 04:27:28.658826 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mf25h" 
podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" Nov 22 04:27:28 crc kubenswrapper[4699]: I1122 04:27:28.789741 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" event={"ID":"417b0282-cef1-4a7c-aca5-593297254fe3","Type":"ContainerDied","Data":"d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9"} Nov 22 04:27:28 crc kubenswrapper[4699]: I1122 04:27:28.789809 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9" Nov 22 04:27:28 crc kubenswrapper[4699]: E1122 04:27:28.791347 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-mf25h" podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" Nov 22 04:27:28 crc kubenswrapper[4699]: I1122 04:27:28.881547 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.017732 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.017809 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.017834 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.017897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.018041 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.018102 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzsj\" 
(UniqueName: \"kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj\") pod \"417b0282-cef1-4a7c-aca5-593297254fe3\" (UID: \"417b0282-cef1-4a7c-aca5-593297254fe3\") " Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.023703 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj" (OuterVolumeSpecName: "kube-api-access-pnzsj") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "kube-api-access-pnzsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.063926 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.068203 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.070178 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.072027 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config" (OuterVolumeSpecName: "config") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.077675 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "417b0282-cef1-4a7c-aca5-593297254fe3" (UID: "417b0282-cef1-4a7c-aca5-593297254fe3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.119934 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.119973 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzsj\" (UniqueName: \"kubernetes.io/projected/417b0282-cef1-4a7c-aca5-593297254fe3-kube-api-access-pnzsj\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.119989 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.120001 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc 
kubenswrapper[4699]: I1122 04:27:29.120012 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.120024 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/417b0282-cef1-4a7c-aca5-593297254fe3-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:29 crc kubenswrapper[4699]: E1122 04:27:29.606979 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 22 04:27:29 crc kubenswrapper[4699]: E1122 04:27:29.607310 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzdkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fgx2c_openstack(a2442edb-5370-4fd9-af87-6cb17498cee6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:27:29 crc kubenswrapper[4699]: E1122 04:27:29.608720 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fgx2c" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.798792 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" Nov 22 04:27:29 crc kubenswrapper[4699]: E1122 04:27:29.800915 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-fgx2c" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.844220 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"] Nov 22 04:27:29 crc kubenswrapper[4699]: I1122 04:27:29.853356 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-x5pmx"] Nov 22 04:27:30 crc kubenswrapper[4699]: I1122 04:27:30.534253 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-x5pmx" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout" Nov 22 04:27:30 crc 
kubenswrapper[4699]: E1122 04:27:30.536551 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 22 04:27:30 crc kubenswrapper[4699]: E1122 04:27:30.536903 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29zst,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityCo
ntext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-zb5vb_openstack(3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:27:30 crc kubenswrapper[4699]: E1122 04:27:30.538650 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-zb5vb" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" Nov 22 04:27:30 crc kubenswrapper[4699]: I1122 04:27:30.811882 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerStarted","Data":"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107"} Nov 22 04:27:30 crc kubenswrapper[4699]: E1122 04:27:30.814607 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-zb5vb" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.042387 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-bd6j2"] Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.054818 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-bootstrap-5bqjq"] Nov 22 04:27:31 crc kubenswrapper[4699]: W1122 04:27:31.076933 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae96d89d_006a_4e7c_a42b_916cc7c77d19.slice/crio-146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d WatchSource:0}: Error finding container 146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d: Status 404 returned error can't find the container with id 146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d Nov 22 04:27:31 crc kubenswrapper[4699]: W1122 04:27:31.077314 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19251598_5cdb_4e4f_9eb7_05cd21d988fb.slice/crio-bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179 WatchSource:0}: Error finding container bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179: Status 404 returned error can't find the container with id bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179 Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.082484 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.465894 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" path="/var/lib/kubelet/pods/417b0282-cef1-4a7c-aca5-593297254fe3/volumes" Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.828742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerStarted","Data":"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.828894 4699 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-log" containerID="cri-o://7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" gracePeriod=30 Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.829391 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-httpd" containerID="cri-o://e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" gracePeriod=30 Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.838792 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bqjq" event={"ID":"ae96d89d-006a-4e7c-a42b-916cc7c77d19","Type":"ContainerStarted","Data":"ed5ad378ee9c82aaf3a68b092e4b798ba21ae9f666d69475f0a13e6d97a8c3c2"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.838839 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bqjq" event={"ID":"ae96d89d-006a-4e7c-a42b-916cc7c77d19","Type":"ContainerStarted","Data":"146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.840548 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-bd6j2" event={"ID":"19251598-5cdb-4e4f-9eb7-05cd21d988fb","Type":"ContainerStarted","Data":"bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.848168 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerStarted","Data":"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.848223 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerStarted","Data":"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503"} Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.848417 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-log" containerID="cri-o://3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" gracePeriod=30 Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.848789 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-httpd" containerID="cri-o://7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" gracePeriod=30 Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.861974 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=58.86195779 podStartE2EDuration="58.86195779s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:31.850745948 +0000 UTC m=+1203.193367165" watchObservedRunningTime="2025-11-22 04:27:31.86195779 +0000 UTC m=+1203.204578977" Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.878236 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5bqjq" podStartSLOduration=42.878216554 podStartE2EDuration="42.878216554s" podCreationTimestamp="2025-11-22 04:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:31.871790759 +0000 UTC m=+1203.214411956" watchObservedRunningTime="2025-11-22 04:27:31.878216554 +0000 UTC 
m=+1203.220837751" Nov 22 04:27:31 crc kubenswrapper[4699]: I1122 04:27:31.896892 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=58.896873637 podStartE2EDuration="58.896873637s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:31.891775973 +0000 UTC m=+1203.234397180" watchObservedRunningTime="2025-11-22 04:27:31.896873637 +0000 UTC m=+1203.239494844" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.680557 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.730958 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831118 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx665\" (UniqueName: \"kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831265 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831322 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: 
\"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831352 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831391 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831422 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831468 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.831562 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle\") pod \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\" (UID: \"00e712f4-ba35-46e1-8ff2-3e1eb7615e69\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.832596 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.832767 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs" (OuterVolumeSpecName: "logs") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.836738 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665" (OuterVolumeSpecName: "kube-api-access-cx665") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "kube-api-access-cx665". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.837504 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.838475 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts" (OuterVolumeSpecName: "scripts") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.862917 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.865224 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerStarted","Data":"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.868917 4699 generic.go:334] "Generic (PLEG): container finished" podID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerID="7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" exitCode=143 Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.868947 4699 generic.go:334] "Generic (PLEG): container finished" podID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerID="3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" exitCode=143 Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.869005 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerDied","Data":"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.869033 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerDied","Data":"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 
04:27:32.869046 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"00e712f4-ba35-46e1-8ff2-3e1eb7615e69","Type":"ContainerDied","Data":"9f331d1ae4c1f9bd39ad77ec1b032f47c0488d65f9131d02d701da0a68172542"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.869063 4699 scope.go:117] "RemoveContainer" containerID="7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.869092 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.875907 4699 generic.go:334] "Generic (PLEG): container finished" podID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerID="e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" exitCode=0 Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.875948 4699 generic.go:334] "Generic (PLEG): container finished" podID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerID="7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" exitCode=143 Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.876896 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.877622 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerDied","Data":"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.877727 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerDied","Data":"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.877788 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"06d61fb1-732e-4e73-a859-b94c63838a8a","Type":"ContainerDied","Data":"e0f87c3418e69470ea3eeebe9da556893834dd0b165d44566c95b126f3d51887"} Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.883585 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.894282 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data" (OuterVolumeSpecName: "config-data") pod "00e712f4-ba35-46e1-8ff2-3e1eb7615e69" (UID: "00e712f4-ba35-46e1-8ff2-3e1eb7615e69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.897479 4699 scope.go:117] "RemoveContainer" containerID="3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.923155 4699 scope.go:117] "RemoveContainer" containerID="7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" Nov 22 04:27:32 crc kubenswrapper[4699]: E1122 04:27:32.923749 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae\": container with ID starting with 7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae not found: ID does not exist" containerID="7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.923805 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae"} err="failed to get container status \"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae\": rpc error: code = NotFound desc = could not find container \"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae\": container with ID starting with 7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.923829 4699 scope.go:117] "RemoveContainer" containerID="3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" Nov 22 04:27:32 crc kubenswrapper[4699]: E1122 04:27:32.924312 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503\": container with ID starting with 
3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503 not found: ID does not exist" containerID="3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.924359 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503"} err="failed to get container status \"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503\": rpc error: code = NotFound desc = could not find container \"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503\": container with ID starting with 3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.924426 4699 scope.go:117] "RemoveContainer" containerID="7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.925106 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae"} err="failed to get container status \"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae\": rpc error: code = NotFound desc = could not find container \"7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae\": container with ID starting with 7d52006815d14f4743a48ab99dff4d0910d13e08ba6ba37a93619bb058465bae not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.925129 4699 scope.go:117] "RemoveContainer" containerID="3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.925542 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503"} err="failed to get container status 
\"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503\": rpc error: code = NotFound desc = could not find container \"3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503\": container with ID starting with 3e2f6000bff516f9a22e208f6a7f2728e74a0b89740f082fefb122194ac10503 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.925570 4699 scope.go:117] "RemoveContainer" containerID="e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932349 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932468 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932588 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932647 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fdjt\" (UniqueName: \"kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932674 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932721 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932745 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.932787 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"06d61fb1-732e-4e73-a859-b94c63838a8a\" (UID: \"06d61fb1-732e-4e73-a859-b94c63838a8a\") " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933129 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933145 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx665\" (UniqueName: \"kubernetes.io/projected/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-kube-api-access-cx665\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933155 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933164 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933182 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933191 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933200 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933208 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/00e712f4-ba35-46e1-8ff2-3e1eb7615e69-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933532 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.933863 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs" (OuterVolumeSpecName: "logs") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.937599 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.937657 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts" (OuterVolumeSpecName: "scripts") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.941098 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt" (OuterVolumeSpecName: "kube-api-access-2fdjt") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "kube-api-access-2fdjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.951104 4699 scope.go:117] "RemoveContainer" containerID="7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.962011 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.962580 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.976389 4699 scope.go:117] "RemoveContainer" containerID="e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" Nov 22 04:27:32 crc kubenswrapper[4699]: E1122 04:27:32.976957 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047\": container with ID starting with e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047 not found: ID does not exist" containerID="e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.977022 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047"} err="failed to get container status \"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047\": rpc error: code = NotFound desc = could not find container 
\"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047\": container with ID starting with e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.977050 4699 scope.go:117] "RemoveContainer" containerID="7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" Nov 22 04:27:32 crc kubenswrapper[4699]: E1122 04:27:32.978033 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1\": container with ID starting with 7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1 not found: ID does not exist" containerID="7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.978069 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1"} err="failed to get container status \"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1\": rpc error: code = NotFound desc = could not find container \"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1\": container with ID starting with 7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.978101 4699 scope.go:117] "RemoveContainer" containerID="e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.978472 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047"} err="failed to get container status \"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047\": rpc error: code = NotFound desc = could not find 
container \"e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047\": container with ID starting with e70ff67d7560be6c98247f26b9d4bbf192951b4117d07dca46e5a6e531143047 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.978496 4699 scope.go:117] "RemoveContainer" containerID="7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.978695 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1"} err="failed to get container status \"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1\": rpc error: code = NotFound desc = could not find container \"7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1\": container with ID starting with 7c129a5b76f2f8ffc6901291e9e5f0b872b96d5cfb2ea850f76aaeeb877607e1 not found: ID does not exist" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.981445 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:32 crc kubenswrapper[4699]: I1122 04:27:32.983870 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data" (OuterVolumeSpecName: "config-data") pod "06d61fb1-732e-4e73-a859-b94c63838a8a" (UID: "06d61fb1-732e-4e73-a859-b94c63838a8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035353 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035390 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fdjt\" (UniqueName: \"kubernetes.io/projected/06d61fb1-732e-4e73-a859-b94c63838a8a-kube-api-access-2fdjt\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035401 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035411 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035422 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035453 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035476 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035485 4699 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06d61fb1-732e-4e73-a859-b94c63838a8a-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.035493 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06d61fb1-732e-4e73-a859-b94c63838a8a-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.052095 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.136717 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.222367 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.248566 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.274625 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275200 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275220 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275231 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 
04:27:33.275240 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275259 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275267 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275286 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275294 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275308 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275324 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: E1122 04:27:33.275353 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="init" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275361 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="init" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275609 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275626 4699 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="417b0282-cef1-4a7c-aca5-593297254fe3" containerName="dnsmasq-dns" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275637 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-log" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275649 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.275660 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" containerName="glance-httpd" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.278115 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.282164 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4kqxm" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.282515 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.282666 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.283247 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.289685 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.339817 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.350823 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.358108 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.360537 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.364036 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.364366 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.367328 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446087 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446287 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446483 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjv87\" (UniqueName: \"kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446507 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446529 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446564 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.446671 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.457886 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e712f4-ba35-46e1-8ff2-3e1eb7615e69" path="/var/lib/kubelet/pods/00e712f4-ba35-46e1-8ff2-3e1eb7615e69/volumes" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.459402 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06d61fb1-732e-4e73-a859-b94c63838a8a" path="/var/lib/kubelet/pods/06d61fb1-732e-4e73-a859-b94c63838a8a/volumes" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548490 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjv87\" (UniqueName: \"kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548536 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548571 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548592 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548612 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548654 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548677 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548704 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548723 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548797 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548829 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548866 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548898 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548912 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pk7lv\" (UniqueName: \"kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548927 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.548945 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.549255 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.558813 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.559122 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.559426 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.560631 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.561076 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.564238 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.584700 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: 
\"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.585412 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjv87\" (UniqueName: \"kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87\") pod \"glance-default-external-api-0\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.599936 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.650368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.650493 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.650643 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.650523 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pk7lv\" (UniqueName: \"kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.654653 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.654679 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.655921 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.658089 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.667195 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.667288 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.667416 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.667952 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.671200 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7lv\" (UniqueName: \"kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.673775 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.674176 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.676044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.678343 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:27:33 crc kubenswrapper[4699]: I1122 04:27:33.682994 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:37 crc kubenswrapper[4699]: E1122 04:27:37.197178 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:27:37 crc kubenswrapper[4699]: I1122 04:27:37.924864 4699 generic.go:334] "Generic (PLEG): container finished" podID="ae96d89d-006a-4e7c-a42b-916cc7c77d19" containerID="ed5ad378ee9c82aaf3a68b092e4b798ba21ae9f666d69475f0a13e6d97a8c3c2" exitCode=0 Nov 22 04:27:37 crc kubenswrapper[4699]: I1122 04:27:37.924953 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bqjq" event={"ID":"ae96d89d-006a-4e7c-a42b-916cc7c77d19","Type":"ContainerDied","Data":"ed5ad378ee9c82aaf3a68b092e4b798ba21ae9f666d69475f0a13e6d97a8c3c2"} Nov 22 04:27:38 crc kubenswrapper[4699]: I1122 04:27:38.725636 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:27:38 crc kubenswrapper[4699]: I1122 04:27:38.725691 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:27:39 crc kubenswrapper[4699]: 
I1122 04:27:39.741152 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:27:43 crc kubenswrapper[4699]: W1122 04:27:43.071552 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f028f32_9e14_40c5_9944_3fed1f6c2aee.slice/crio-7b1dc965d67858085fc71a25f103fa3eb8408de947fee264fd2c8557578b21e7 WatchSource:0}: Error finding container 7b1dc965d67858085fc71a25f103fa3eb8408de947fee264fd2c8557578b21e7: Status 404 returned error can't find the container with id 7b1dc965d67858085fc71a25f103fa3eb8408de947fee264fd2c8557578b21e7 Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.257673 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.380969 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.381718 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.382206 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8mc\" (UniqueName: \"kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.382265 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.382702 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.382810 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys\") pod \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\" (UID: \"ae96d89d-006a-4e7c-a42b-916cc7c77d19\") " Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.385940 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts" (OuterVolumeSpecName: "scripts") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.386022 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc" (OuterVolumeSpecName: "kube-api-access-dz8mc") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "kube-api-access-dz8mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.386654 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.396903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.409914 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.412526 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data" (OuterVolumeSpecName: "config-data") pod "ae96d89d-006a-4e7c-a42b-916cc7c77d19" (UID: "ae96d89d-006a-4e7c-a42b-916cc7c77d19"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484613 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484649 4699 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484659 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484667 4699 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484677 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8mc\" (UniqueName: \"kubernetes.io/projected/ae96d89d-006a-4e7c-a42b-916cc7c77d19-kube-api-access-dz8mc\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.484687 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae96d89d-006a-4e7c-a42b-916cc7c77d19-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.623147 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:27:43 crc kubenswrapper[4699]: W1122 04:27:43.651143 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9bd4fef_05a7_44fd_9c7d_dd9118839aa6.slice/crio-2a94d26bbb79e144ca368c998909adc7edaf85b5f7dc3b12851ebc2340147d36 WatchSource:0}: Error finding container 2a94d26bbb79e144ca368c998909adc7edaf85b5f7dc3b12851ebc2340147d36: Status 404 returned error can't find the container with id 2a94d26bbb79e144ca368c998909adc7edaf85b5f7dc3b12851ebc2340147d36 Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.998722 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerStarted","Data":"af2485d201b5d7082d752db10c95f6a969bf19c959ffbcb882551a8bf0e3248a"} Nov 22 04:27:43 crc kubenswrapper[4699]: I1122 04:27:43.998764 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerStarted","Data":"7b1dc965d67858085fc71a25f103fa3eb8408de947fee264fd2c8557578b21e7"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.003217 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5bqjq" event={"ID":"ae96d89d-006a-4e7c-a42b-916cc7c77d19","Type":"ContainerDied","Data":"146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.003265 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146b31f5e674ad70d115ed162deb0ed3f1020909d819ee65aefb65627b397e9d" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.003296 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5bqjq" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.013980 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerStarted","Data":"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.016569 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mf25h" event={"ID":"a79a788b-1b1c-45df-9c90-3c30d382691b","Type":"ContainerStarted","Data":"8d7efbd273318955f317598578a519fd4d002838c29a6f9db5a0fec055aba67f"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.019322 4699 generic.go:334] "Generic (PLEG): container finished" podID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerID="2a481bb7f15a6a4791a58235892837aff3dffb9d07bac256ec026f24367109ad" exitCode=0 Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.019375 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-bd6j2" event={"ID":"19251598-5cdb-4e4f-9eb7-05cd21d988fb","Type":"ContainerDied","Data":"2a481bb7f15a6a4791a58235892837aff3dffb9d07bac256ec026f24367109ad"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.023732 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerStarted","Data":"2a94d26bbb79e144ca368c998909adc7edaf85b5f7dc3b12851ebc2340147d36"} Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.040522 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mf25h" podStartSLOduration=2.557240118 podStartE2EDuration="1m11.040502177s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="2025-11-22 04:26:35.149522983 +0000 UTC m=+1146.492144170" lastFinishedPulling="2025-11-22 04:27:43.632785032 +0000 UTC 
m=+1214.975406229" observedRunningTime="2025-11-22 04:27:44.032345279 +0000 UTC m=+1215.374966476" watchObservedRunningTime="2025-11-22 04:27:44.040502177 +0000 UTC m=+1215.383123364" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.507340 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6bf6559788-s4hk6"] Nov 22 04:27:44 crc kubenswrapper[4699]: E1122 04:27:44.507883 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae96d89d-006a-4e7c-a42b-916cc7c77d19" containerName="keystone-bootstrap" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.507896 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae96d89d-006a-4e7c-a42b-916cc7c77d19" containerName="keystone-bootstrap" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.508046 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae96d89d-006a-4e7c-a42b-916cc7c77d19" containerName="keystone-bootstrap" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.513063 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.520735 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.521073 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.522006 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.522217 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qfldf" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.522658 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.536918 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.538382 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bf6559788-s4hk6"] Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.611013 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-config-data\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.611310 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj55v\" (UniqueName: \"kubernetes.io/projected/15aff0a7-6c4f-449c-addf-6cea805a4820-kube-api-access-pj55v\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " 
pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.611425 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-internal-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.611678 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-fernet-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.616491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-scripts\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.616660 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-combined-ca-bundle\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.616813 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-public-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " 
pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.616920 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-credential-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718216 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-combined-ca-bundle\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718773 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-public-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718835 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-credential-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718913 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-config-data\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc 
kubenswrapper[4699]: I1122 04:27:44.718944 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj55v\" (UniqueName: \"kubernetes.io/projected/15aff0a7-6c4f-449c-addf-6cea805a4820-kube-api-access-pj55v\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718968 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-internal-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.718991 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-fernet-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.719072 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-scripts\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.725839 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-credential-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.726469 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-combined-ca-bundle\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.727163 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-config-data\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.727179 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-public-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.727710 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-internal-tls-certs\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.729763 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-fernet-keys\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.730782 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15aff0a7-6c4f-449c-addf-6cea805a4820-scripts\") pod 
\"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.738978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj55v\" (UniqueName: \"kubernetes.io/projected/15aff0a7-6c4f-449c-addf-6cea805a4820-kube-api-access-pj55v\") pod \"keystone-6bf6559788-s4hk6\" (UID: \"15aff0a7-6c4f-449c-addf-6cea805a4820\") " pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:44 crc kubenswrapper[4699]: I1122 04:27:44.898720 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.051923 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zb5vb" event={"ID":"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b","Type":"ContainerStarted","Data":"cf76bc5ff52b00f4f1862b5dd35e5462be22707578a7621c50daad89a2ee162c"} Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.071616 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zb5vb" podStartSLOduration=7.092733683 podStartE2EDuration="1m12.071595205s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="2025-11-22 04:26:38.79994871 +0000 UTC m=+1150.142569897" lastFinishedPulling="2025-11-22 04:27:43.778810232 +0000 UTC m=+1215.121431419" observedRunningTime="2025-11-22 04:27:45.067578157 +0000 UTC m=+1216.410199364" watchObservedRunningTime="2025-11-22 04:27:45.071595205 +0000 UTC m=+1216.414216392" Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.073976 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-bd6j2" event={"ID":"19251598-5cdb-4e4f-9eb7-05cd21d988fb","Type":"ContainerStarted","Data":"d52ea8982be66097720e91090ec7a19a59fb12cb7bf19c4c25f043ea53daf19c"} Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.082935 4699 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerStarted","Data":"0965e0e4b0f9d33255e1ca3fe32acf964e60ff77ca0da40d7a5561ecbf5baff7"} Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.090798 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgx2c" event={"ID":"a2442edb-5370-4fd9-af87-6cb17498cee6","Type":"ContainerStarted","Data":"96132b23ec10daaf82debb2041fe6a9acbd17fb95fd2543bf6497612064673cd"} Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.103350 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerStarted","Data":"9c3a6f68089699e05f57c467995c468566c3aee2a5f9f2e59cb387045899b545"} Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.109128 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-bd6j2" podStartSLOduration=45.068691026 podStartE2EDuration="57.109107124s" podCreationTimestamp="2025-11-22 04:26:48 +0000 UTC" firstStartedPulling="2025-11-22 04:27:31.079871778 +0000 UTC m=+1202.422492965" lastFinishedPulling="2025-11-22 04:27:43.120287876 +0000 UTC m=+1214.462909063" observedRunningTime="2025-11-22 04:27:45.102990996 +0000 UTC m=+1216.445612183" watchObservedRunningTime="2025-11-22 04:27:45.109107124 +0000 UTC m=+1216.451728311" Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.156668 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fgx2c" podStartSLOduration=3.491985281 podStartE2EDuration="1m12.156650347s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="2025-11-22 04:26:34.968121446 +0000 UTC m=+1146.310742633" lastFinishedPulling="2025-11-22 04:27:43.632786512 +0000 UTC m=+1214.975407699" observedRunningTime="2025-11-22 04:27:45.1308036 
+0000 UTC m=+1216.473424797" watchObservedRunningTime="2025-11-22 04:27:45.156650347 +0000 UTC m=+1216.499271534" Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.171308 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.171290182 podStartE2EDuration="12.171290182s" podCreationTimestamp="2025-11-22 04:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:45.168695229 +0000 UTC m=+1216.511316426" watchObservedRunningTime="2025-11-22 04:27:45.171290182 +0000 UTC m=+1216.513911369" Nov 22 04:27:45 crc kubenswrapper[4699]: I1122 04:27:45.439578 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bf6559788-s4hk6"] Nov 22 04:27:45 crc kubenswrapper[4699]: W1122 04:27:45.451656 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15aff0a7_6c4f_449c_addf_6cea805a4820.slice/crio-3cb615968cb4fb2e5be63aac36cdd1133cf0fc961be2cb6d4f3c0467837e2f62 WatchSource:0}: Error finding container 3cb615968cb4fb2e5be63aac36cdd1133cf0fc961be2cb6d4f3c0467837e2f62: Status 404 returned error can't find the container with id 3cb615968cb4fb2e5be63aac36cdd1133cf0fc961be2cb6d4f3c0467837e2f62 Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.117357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerStarted","Data":"adfcd113ddb0a425861801cc4ebb792f2beb998c582350cf8e362757c1b7afa5"} Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.125355 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bf6559788-s4hk6" 
event={"ID":"15aff0a7-6c4f-449c-addf-6cea805a4820","Type":"ContainerStarted","Data":"5ca79670fb37824e300ea47eae4d45056e87805cbaaf406187fd72f5367c008c"} Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.125529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bf6559788-s4hk6" event={"ID":"15aff0a7-6c4f-449c-addf-6cea805a4820","Type":"ContainerStarted","Data":"3cb615968cb4fb2e5be63aac36cdd1133cf0fc961be2cb6d4f3c0467837e2f62"} Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.126873 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.145019 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.145001519 podStartE2EDuration="13.145001519s" podCreationTimestamp="2025-11-22 04:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:46.137124068 +0000 UTC m=+1217.479745255" watchObservedRunningTime="2025-11-22 04:27:46.145001519 +0000 UTC m=+1217.487622706" Nov 22 04:27:46 crc kubenswrapper[4699]: I1122 04:27:46.162209 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6bf6559788-s4hk6" podStartSLOduration=2.162184066 podStartE2EDuration="2.162184066s" podCreationTimestamp="2025-11-22 04:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:46.15782657 +0000 UTC m=+1217.500447757" watchObservedRunningTime="2025-11-22 04:27:46.162184066 +0000 UTC m=+1217.504805253" Nov 22 04:27:47 crc kubenswrapper[4699]: E1122 04:27:47.444529 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:27:51 crc kubenswrapper[4699]: I1122 04:27:51.170248 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" containerID="cf76bc5ff52b00f4f1862b5dd35e5462be22707578a7621c50daad89a2ee162c" exitCode=0 Nov 22 04:27:51 crc kubenswrapper[4699]: I1122 04:27:51.170336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zb5vb" event={"ID":"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b","Type":"ContainerDied","Data":"cf76bc5ff52b00f4f1862b5dd35e5462be22707578a7621c50daad89a2ee162c"} Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.012263 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zb5vb" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.176269 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts\") pod \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.176376 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data\") pod \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.176395 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29zst\" (UniqueName: \"kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst\") pod \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.176426 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle\") pod \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.176560 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs\") pod \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\" (UID: \"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b\") " Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.177096 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs" (OuterVolumeSpecName: "logs") pod "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" (UID: "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.181915 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst" (OuterVolumeSpecName: "kube-api-access-29zst") pod "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" (UID: "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b"). InnerVolumeSpecName "kube-api-access-29zst". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.182169 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts" (OuterVolumeSpecName: "scripts") pod "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" (UID: "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.191759 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zb5vb" event={"ID":"3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b","Type":"ContainerDied","Data":"476ef44dd4a9f643bb9383a43667c19f3733e804c8f1eb13e3696f0193414f23"} Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.191795 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="476ef44dd4a9f643bb9383a43667c19f3733e804c8f1eb13e3696f0193414f23" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.192494 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zb5vb" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.205503 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data" (OuterVolumeSpecName: "config-data") pod "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" (UID: "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.207634 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" (UID: "3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.278620 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.278649 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.278661 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.278671 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29zst\" (UniqueName: \"kubernetes.io/projected/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-kube-api-access-29zst\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:53 crc 
kubenswrapper[4699]: I1122 04:27:53.278684 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.424070 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64688bf4db-vwnwg"] Nov 22 04:27:53 crc kubenswrapper[4699]: E1122 04:27:53.424655 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" containerName="placement-db-sync" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.424668 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" containerName="placement-db-sync" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.424846 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" containerName="placement-db-sync" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.425703 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.428455 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.428895 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.441270 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64688bf4db-vwnwg"] Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583128 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-internal-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583173 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz8kn\" (UniqueName: \"kubernetes.io/projected/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-kube-api-access-gz8kn\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583197 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-config-data\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583318 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-public-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-combined-ca-bundle\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583390 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-logs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.583461 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-scripts\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.600551 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.600595 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.640937 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 
04:27:53.640999 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.683800 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.683871 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685305 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-internal-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685348 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz8kn\" (UniqueName: \"kubernetes.io/projected/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-kube-api-access-gz8kn\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685371 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-config-data\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685481 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-public-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: 
\"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685502 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-combined-ca-bundle\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685539 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-logs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.685587 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-scripts\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.687153 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-logs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.690655 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-internal-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 
04:27:53.690998 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-config-data\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.692216 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-scripts\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.692770 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-combined-ca-bundle\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.693037 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-public-tls-certs\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.704639 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz8kn\" (UniqueName: \"kubernetes.io/projected/11aab908-3152-4d7b-bfb3-b4f3e04bb7a8-kube-api-access-gz8kn\") pod \"placement-64688bf4db-vwnwg\" (UID: \"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8\") " pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.713296 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.724346 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:53 crc kubenswrapper[4699]: I1122 04:27:53.752592 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:54 crc kubenswrapper[4699]: W1122 04:27:54.196134 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11aab908_3152_4d7b_bfb3_b4f3e04bb7a8.slice/crio-23a5c25d5c4ccfad389804f94e6d57c7ac9a44f4097e66214d652daa9632cc0c WatchSource:0}: Error finding container 23a5c25d5c4ccfad389804f94e6d57c7ac9a44f4097e66214d652daa9632cc0c: Status 404 returned error can't find the container with id 23a5c25d5c4ccfad389804f94e6d57c7ac9a44f4097e66214d652daa9632cc0c Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.202032 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64688bf4db-vwnwg"] Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.206134 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerStarted","Data":"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c"} Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.206698 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.206728 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.206741 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 
04:27:54.206751 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.207125 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-central-agent" containerID="cri-o://9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107" gracePeriod=30 Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.207569 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="proxy-httpd" containerID="cri-o://525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c" gracePeriod=30 Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.207641 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="sg-core" containerID="cri-o://fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5" gracePeriod=30 Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.207697 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-notification-agent" containerID="cri-o://d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad" gracePeriod=30 Nov 22 04:27:54 crc kubenswrapper[4699]: I1122 04:27:54.236225 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.12684073 podStartE2EDuration="1m21.23581951s" podCreationTimestamp="2025-11-22 04:26:33 +0000 UTC" firstStartedPulling="2025-11-22 04:26:34.950557231 +0000 UTC m=+1146.293178418" lastFinishedPulling="2025-11-22 04:27:53.059536011 +0000 UTC m=+1224.402157198" 
observedRunningTime="2025-11-22 04:27:54.23088525 +0000 UTC m=+1225.573506447" watchObservedRunningTime="2025-11-22 04:27:54.23581951 +0000 UTC m=+1225.578440697" Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216208 4699 generic.go:334] "Generic (PLEG): container finished" podID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerID="525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c" exitCode=0 Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216506 4699 generic.go:334] "Generic (PLEG): container finished" podID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerID="fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5" exitCode=2 Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216516 4699 generic.go:334] "Generic (PLEG): container finished" podID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerID="9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107" exitCode=0 Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerDied","Data":"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216570 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerDied","Data":"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.216582 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerDied","Data":"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.218747 4699 generic.go:334] "Generic (PLEG): container finished" podID="a79a788b-1b1c-45df-9c90-3c30d382691b" 
containerID="8d7efbd273318955f317598578a519fd4d002838c29a6f9db5a0fec055aba67f" exitCode=0 Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.218827 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mf25h" event={"ID":"a79a788b-1b1c-45df-9c90-3c30d382691b","Type":"ContainerDied","Data":"8d7efbd273318955f317598578a519fd4d002838c29a6f9db5a0fec055aba67f"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.220707 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64688bf4db-vwnwg" event={"ID":"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8","Type":"ContainerStarted","Data":"207a168addc65f614b04d61192bfe61b7defed42653b53a93a99a11bf1312da4"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.220736 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64688bf4db-vwnwg" event={"ID":"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8","Type":"ContainerStarted","Data":"876c6012bfa266c52d3058d21dfa541b294f3bf0b9c9503908c6b00a4986f2d2"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.220748 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64688bf4db-vwnwg" event={"ID":"11aab908-3152-4d7b-bfb3-b4f3e04bb7a8","Type":"ContainerStarted","Data":"23a5c25d5c4ccfad389804f94e6d57c7ac9a44f4097e66214d652daa9632cc0c"} Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.221209 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:55 crc kubenswrapper[4699]: I1122 04:27:55.265344 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64688bf4db-vwnwg" podStartSLOduration=2.265318259 podStartE2EDuration="2.265318259s" podCreationTimestamp="2025-11-22 04:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:27:55.262267905 +0000 UTC m=+1226.604889102" 
watchObservedRunningTime="2025-11-22 04:27:55.265318259 +0000 UTC m=+1226.607939446" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.229925 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.247875 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.248042 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.250121 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.302077 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.302210 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.311394 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.674655 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mf25h" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.842344 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle\") pod \"a79a788b-1b1c-45df-9c90-3c30d382691b\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.842983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data\") pod \"a79a788b-1b1c-45df-9c90-3c30d382691b\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.843188 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qljtl\" (UniqueName: \"kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl\") pod \"a79a788b-1b1c-45df-9c90-3c30d382691b\" (UID: \"a79a788b-1b1c-45df-9c90-3c30d382691b\") " Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.854579 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a79a788b-1b1c-45df-9c90-3c30d382691b" (UID: "a79a788b-1b1c-45df-9c90-3c30d382691b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.858698 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl" (OuterVolumeSpecName: "kube-api-access-qljtl") pod "a79a788b-1b1c-45df-9c90-3c30d382691b" (UID: "a79a788b-1b1c-45df-9c90-3c30d382691b"). 
InnerVolumeSpecName "kube-api-access-qljtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.879892 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79a788b-1b1c-45df-9c90-3c30d382691b" (UID: "a79a788b-1b1c-45df-9c90-3c30d382691b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.945481 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.945523 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qljtl\" (UniqueName: \"kubernetes.io/projected/a79a788b-1b1c-45df-9c90-3c30d382691b-kube-api-access-qljtl\") on node \"crc\" DevicePath \"\""
Nov 22 04:27:56 crc kubenswrapper[4699]: I1122 04:27:56.945537 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79a788b-1b1c-45df-9c90-3c30d382691b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.241560 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mf25h"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.241568 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mf25h" event={"ID":"a79a788b-1b1c-45df-9c90-3c30d382691b","Type":"ContainerDied","Data":"f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336"}
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.242567 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1f8dc05102539581467cfc1e4f27a6a37f142fa43b9e81ceb746a7f002cc336"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.480170 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-85df448b85-c7qlg"]
Nov 22 04:27:57 crc kubenswrapper[4699]: E1122 04:27:57.480637 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" containerName="barbican-db-sync"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.480657 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" containerName="barbican-db-sync"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.480884 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" containerName="barbican-db-sync"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.493649 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.509381 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.509806 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gn46q"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.509979 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.517934 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85df448b85-c7qlg"]
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.559898 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78b588d944-t7d25"]
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.561339 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.575295 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.637536 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78b588d944-t7d25"]
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667489 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data-custom\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667554 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-combined-ca-bundle\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667622 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrfq\" (UniqueName: \"kubernetes.io/projected/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-kube-api-access-4jrfq\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667661 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data-custom\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667692 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667960 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-logs\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.667997 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpvzb\" (UniqueName: \"kubernetes.io/projected/da5bf8fa-2592-445a-acfc-56e044b4291c-kube-api-access-wpvzb\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.668027 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-combined-ca-bundle\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.668053 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.668072 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5bf8fa-2592-445a-acfc-56e044b4291c-logs\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773541 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpvzb\" (UniqueName: \"kubernetes.io/projected/da5bf8fa-2592-445a-acfc-56e044b4291c-kube-api-access-wpvzb\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773609 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-combined-ca-bundle\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773642 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773669 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5bf8fa-2592-445a-acfc-56e044b4291c-logs\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773702 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data-custom\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-combined-ca-bundle\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773762 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrfq\" (UniqueName: \"kubernetes.io/projected/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-kube-api-access-4jrfq\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773782 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data-custom\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773807 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.773871 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-logs\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.774383 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-logs\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.795882 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data-custom\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.796449 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5bf8fa-2592-445a-acfc-56e044b4291c-logs\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.812984 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data-custom\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.813044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpvzb\" (UniqueName: \"kubernetes.io/projected/da5bf8fa-2592-445a-acfc-56e044b4291c-kube-api-access-wpvzb\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.817741 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-config-data\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.818293 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-combined-ca-bundle\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.818866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-config-data\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.831148 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrfq\" (UniqueName: \"kubernetes.io/projected/6434b63e-cd0f-4cc2-aa3e-463cbf9e7800-kube-api-access-4jrfq\") pod \"barbican-worker-85df448b85-c7qlg\" (UID: \"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800\") " pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.848567 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5bf8fa-2592-445a-acfc-56e044b4291c-combined-ca-bundle\") pod \"barbican-keystone-listener-78b588d944-t7d25\" (UID: \"da5bf8fa-2592-445a-acfc-56e044b4291c\") " pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.868633 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-85df448b85-c7qlg"
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.881653 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"]
Nov 22 04:27:57 crc kubenswrapper[4699]: I1122 04:27:57.924237 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.007778 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"]
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.026946 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76d968d474-5l6z2"]
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.033278 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78b588d944-t7d25"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.035779 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.039641 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.042596 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76d968d474-5l6z2"]
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046304 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046450 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046481 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046534 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046566 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t745z\" (UniqueName: \"kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.046599 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149620 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149737 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149786 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149836 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t745z\" (UniqueName: \"kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149875 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.149926 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hd57\" (UniqueName: \"kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.151165 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.151216 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.153507 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.153793 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.153838 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.153898 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.154280 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.155083 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.155145 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.177063 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t745z\" (UniqueName: \"kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z\") pod \"dnsmasq-dns-586bdc5f9-f9dfc\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: E1122 04:27:58.204186 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache]"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257179 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hd57\" (UniqueName: \"kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257420 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257463 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257513 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.257886 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.273488 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.275952 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.277148 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.293086 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hd57\" (UniqueName: \"kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57\") pod \"barbican-api-76d968d474-5l6z2\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.340964 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.410251 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76d968d474-5l6z2"
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.437855 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-85df448b85-c7qlg"]
Nov 22 04:27:58 crc kubenswrapper[4699]: W1122 04:27:58.450771 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6434b63e_cd0f_4cc2_aa3e_463cbf9e7800.slice/crio-b0b67d23eee0b6d4e6a89b935c72465aba3c9f342f53bed66802e804b59e550d WatchSource:0}: Error finding container b0b67d23eee0b6d4e6a89b935c72465aba3c9f342f53bed66802e804b59e550d: Status 404 returned error can't find the container with id b0b67d23eee0b6d4e6a89b935c72465aba3c9f342f53bed66802e804b59e550d
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.628102 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78b588d944-t7d25"]
Nov 22 04:27:58 crc kubenswrapper[4699]: W1122 04:27:58.639505 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda5bf8fa_2592_445a_acfc_56e044b4291c.slice/crio-17a531e47b667d7f0d365938645845df74922b9bee7c10583dd94bafcb66b410 WatchSource:0}: Error finding container 17a531e47b667d7f0d365938645845df74922b9bee7c10583dd94bafcb66b410: Status 404 returned error can't find the container with id 17a531e47b667d7f0d365938645845df74922b9bee7c10583dd94bafcb66b410
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.639850 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"]
Nov 22 04:27:58 crc kubenswrapper[4699]: W1122 04:27:58.655542 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2036e588_c3af_438b_9d44_6f77609be731.slice/crio-05069332dfa78bfe28378f5fa711fb68043772b38d0d366ce32eb2851ce353dd WatchSource:0}: Error finding container 05069332dfa78bfe28378f5fa711fb68043772b38d0d366ce32eb2851ce353dd: Status 404 returned error can't find the container with id 05069332dfa78bfe28378f5fa711fb68043772b38d0d366ce32eb2851ce353dd
Nov 22 04:27:58 crc kubenswrapper[4699]: I1122 04:27:58.913491 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76d968d474-5l6z2"]
Nov 22 04:27:58 crc kubenswrapper[4699]: W1122 04:27:58.916253 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeea8b7c9_d91d_4267_a1ec_77fb7cf0a8fe.slice/crio-e53b82e4036a59e3fdc23e4c806462741d6eb6494984e66b259c865fdaa7a7d8 WatchSource:0}: Error finding container e53b82e4036a59e3fdc23e4c806462741d6eb6494984e66b259c865fdaa7a7d8: Status 404 returned error can't find the container with id e53b82e4036a59e3fdc23e4c806462741d6eb6494984e66b259c865fdaa7a7d8
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.252307 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.274407 4699 generic.go:334] "Generic (PLEG): container finished" podID="2036e588-c3af-438b-9d44-6f77609be731" containerID="5c0440816663cc10dbc565ece19ea2612dff9a0d4858a3efbbe40010bc032de9" exitCode=0
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.274481 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" event={"ID":"2036e588-c3af-438b-9d44-6f77609be731","Type":"ContainerDied","Data":"5c0440816663cc10dbc565ece19ea2612dff9a0d4858a3efbbe40010bc032de9"}
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.274562 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" event={"ID":"2036e588-c3af-438b-9d44-6f77609be731","Type":"ContainerStarted","Data":"05069332dfa78bfe28378f5fa711fb68043772b38d0d366ce32eb2851ce353dd"}
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.276757 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") "
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.276971 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") "
Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.277019 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") "
Nov 22 04:27:59
crc kubenswrapper[4699]: I1122 04:27:59.277071 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.277172 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.277196 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.277261 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpbvs\" (UniqueName: \"kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs\") pod \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\" (UID: \"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887\") " Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.279640 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.282592 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.284878 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs" (OuterVolumeSpecName: "kube-api-access-qpbvs") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "kube-api-access-qpbvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.288564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78b588d944-t7d25" event={"ID":"da5bf8fa-2592-445a-acfc-56e044b4291c","Type":"ContainerStarted","Data":"17a531e47b667d7f0d365938645845df74922b9bee7c10583dd94bafcb66b410"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.290943 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts" (OuterVolumeSpecName: "scripts") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.306487 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85df448b85-c7qlg" event={"ID":"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800","Type":"ContainerStarted","Data":"b0b67d23eee0b6d4e6a89b935c72465aba3c9f342f53bed66802e804b59e550d"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.311516 4699 generic.go:334] "Generic (PLEG): container finished" podID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerID="d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad" exitCode=0 Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.311612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerDied","Data":"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.311647 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"058a0faa-f3ce-4c0e-b7f0-e3f915f5d887","Type":"ContainerDied","Data":"a0e358ebdc8c954c0d6b11d9df593c8e636eee1df6225d1c57661f184da6d392"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.311667 4699 scope.go:117] "RemoveContainer" containerID="525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.311674 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.320583 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.324298 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerStarted","Data":"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.324356 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerStarted","Data":"e53b82e4036a59e3fdc23e4c806462741d6eb6494984e66b259c865fdaa7a7d8"} Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.349411 4699 scope.go:117] "RemoveContainer" containerID="fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.381275 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.381338 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.381353 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.381366 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.381378 4699 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-qpbvs\" (UniqueName: \"kubernetes.io/projected/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-kube-api-access-qpbvs\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.387107 4699 scope.go:117] "RemoveContainer" containerID="d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.396805 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.415153 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data" (OuterVolumeSpecName: "config-data") pod "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" (UID: "058a0faa-f3ce-4c0e-b7f0-e3f915f5d887"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.463768 4699 scope.go:117] "RemoveContainer" containerID="9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.483644 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.483682 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.519679 4699 scope.go:117] "RemoveContainer" containerID="525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.520844 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c\": container with ID starting with 525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c not found: ID does not exist" containerID="525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.520876 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c"} err="failed to get container status \"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c\": rpc error: code = NotFound desc = could not find container \"525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c\": container with ID starting with 525cca3afb005e5f7a914dc11e05a682eca50a4d4bc411482d0e8680e6f7729c not found: ID does not 
exist" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.520910 4699 scope.go:117] "RemoveContainer" containerID="fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.521583 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5\": container with ID starting with fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5 not found: ID does not exist" containerID="fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.521646 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5"} err="failed to get container status \"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5\": rpc error: code = NotFound desc = could not find container \"fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5\": container with ID starting with fd90a82aa572442ced06c826d38b474d6390c6e7d82f45a29dae451b3980c5f5 not found: ID does not exist" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.521692 4699 scope.go:117] "RemoveContainer" containerID="d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.522067 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad\": container with ID starting with d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad not found: ID does not exist" containerID="d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.522093 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad"} err="failed to get container status \"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad\": rpc error: code = NotFound desc = could not find container \"d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad\": container with ID starting with d21f1d725f135279f9e93a053c515bdc28bb1fdf2ccefb2b4ce030a5416640ad not found: ID does not exist" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.522106 4699 scope.go:117] "RemoveContainer" containerID="9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.522360 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107\": container with ID starting with 9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107 not found: ID does not exist" containerID="9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.522385 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107"} err="failed to get container status \"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107\": rpc error: code = NotFound desc = could not find container \"9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107\": container with ID starting with 9134c741b6b4714ede5a4c6025c9bb45805c0e6438ba8f14898d5b13c7824107 not found: ID does not exist" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.652302 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.663252 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.673719 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.674336 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-central-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674357 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-central-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.674376 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="sg-core" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674384 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="sg-core" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.674405 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="proxy-httpd" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674411 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="proxy-httpd" Nov 22 04:27:59 crc kubenswrapper[4699]: E1122 04:27:59.674463 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-notification-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674469 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-notification-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674672 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="sg-core" Nov 22 04:27:59 
crc kubenswrapper[4699]: I1122 04:27:59.674691 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="proxy-httpd" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674704 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-central-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.674714 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" containerName="ceilometer-notification-agent" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.676587 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.679364 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.679618 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.683958 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.789893 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790085 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " 
pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790272 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790340 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vjt\" (UniqueName: \"kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790539 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.790575 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.892016 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-59vjt\" (UniqueName: \"kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.892179 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.892204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.893127 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.893199 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.898652 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " 
pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.899573 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.899638 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.899695 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.905335 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.907632 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.908153 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.926667 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:27:59 crc kubenswrapper[4699]: I1122 04:27:59.934077 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vjt\" (UniqueName: \"kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt\") pod \"ceilometer-0\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " pod="openstack/ceilometer-0" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.018463 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.331344 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f47885746-l8msw"] Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.333351 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.339625 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.341344 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.344559 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerStarted","Data":"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a"} Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.344669 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.344700 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.346240 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f47885746-l8msw"] Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.347366 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" event={"ID":"2036e588-c3af-438b-9d44-6f77609be731","Type":"ContainerStarted","Data":"f2069c5bc1774fa1252817c3ef57116a0e1c8c8773cfb60a8d1ed883db685676"} Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.347484 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.378881 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" podStartSLOduration=3.378856446 podStartE2EDuration="3.378856446s" 
podCreationTimestamp="2025-11-22 04:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:00.373513297 +0000 UTC m=+1231.716134494" watchObservedRunningTime="2025-11-22 04:28:00.378856446 +0000 UTC m=+1231.721477633" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.400294 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76d968d474-5l6z2" podStartSLOduration=3.400276296 podStartE2EDuration="3.400276296s" podCreationTimestamp="2025-11-22 04:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:00.39797348 +0000 UTC m=+1231.740594667" watchObservedRunningTime="2025-11-22 04:28:00.400276296 +0000 UTC m=+1231.742897483" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.410577 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data-custom\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.410715 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-public-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.410740 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data\") pod 
\"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.411584 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpm9b\" (UniqueName: \"kubernetes.io/projected/4c29f46a-251d-4422-a524-d5745603c348-kube-api-access-tpm9b\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.411625 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c29f46a-251d-4422-a524-d5745603c348-logs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.411656 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-internal-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.411717 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-combined-ca-bundle\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.513750 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data-custom\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514245 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-public-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514283 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpm9b\" (UniqueName: \"kubernetes.io/projected/4c29f46a-251d-4422-a524-d5745603c348-kube-api-access-tpm9b\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514387 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c29f46a-251d-4422-a524-d5745603c348-logs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514458 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-internal-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.514486 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-combined-ca-bundle\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.559094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c29f46a-251d-4422-a524-d5745603c348-logs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.565053 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-internal-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.565131 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-combined-ca-bundle\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.565187 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpm9b\" (UniqueName: 
\"kubernetes.io/projected/4c29f46a-251d-4422-a524-d5745603c348-kube-api-access-tpm9b\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.566929 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-public-tls-certs\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.566999 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data-custom\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.567962 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c29f46a-251d-4422-a524-d5745603c348-config-data\") pod \"barbican-api-7f47885746-l8msw\" (UID: \"4c29f46a-251d-4422-a524-d5745603c348\") " pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.860344 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:00 crc kubenswrapper[4699]: I1122 04:28:00.883266 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:01 crc kubenswrapper[4699]: I1122 04:28:01.365784 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerStarted","Data":"12ea25681f2e105e75ac9c61ea5fbff5851db9a5984db83cb477a0790f60921a"} Nov 22 04:28:01 crc kubenswrapper[4699]: I1122 04:28:01.406921 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f47885746-l8msw"] Nov 22 04:28:01 crc kubenswrapper[4699]: I1122 04:28:01.459373 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058a0faa-f3ce-4c0e-b7f0-e3f915f5d887" path="/var/lib/kubelet/pods/058a0faa-f3ce-4c0e-b7f0-e3f915f5d887/volumes" Nov 22 04:28:02 crc kubenswrapper[4699]: W1122 04:28:02.133212 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c29f46a_251d_4422_a524_d5745603c348.slice/crio-c47af913122cc4ad86449f013f67b7bfa332c16497ece3bd5134cc060d9b6fd3 WatchSource:0}: Error finding container c47af913122cc4ad86449f013f67b7bfa332c16497ece3bd5134cc060d9b6fd3: Status 404 returned error can't find the container with id c47af913122cc4ad86449f013f67b7bfa332c16497ece3bd5134cc060d9b6fd3 Nov 22 04:28:02 crc kubenswrapper[4699]: I1122 04:28:02.374247 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f47885746-l8msw" event={"ID":"4c29f46a-251d-4422-a524-d5745603c348","Type":"ContainerStarted","Data":"c47af913122cc4ad86449f013f67b7bfa332c16497ece3bd5134cc060d9b6fd3"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.382018 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerStarted","Data":"c3ad54eab486e22378cf2f42efc53505fff8c1abc1e688c04363e261ec5bb886"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.383313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78b588d944-t7d25" event={"ID":"da5bf8fa-2592-445a-acfc-56e044b4291c","Type":"ContainerStarted","Data":"ce481186d1c8134738371726c40566e0e90b292566519c285040634501496d06"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.383340 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78b588d944-t7d25" event={"ID":"da5bf8fa-2592-445a-acfc-56e044b4291c","Type":"ContainerStarted","Data":"32c0bf73e4ca373e26c0c0a97ecfdcc867eda0863603bc970bfad74978113e6f"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.387616 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85df448b85-c7qlg" event={"ID":"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800","Type":"ContainerStarted","Data":"4dfcadff378cf07a39c333156d50e37bc31858667b8fffd1550b955fdf6aeefc"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.387667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-85df448b85-c7qlg" event={"ID":"6434b63e-cd0f-4cc2-aa3e-463cbf9e7800","Type":"ContainerStarted","Data":"a54d5af43190b278d945b57826724ac6eb89b451b5a556805a9f942b8cd57ce4"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.389622 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f47885746-l8msw" event={"ID":"4c29f46a-251d-4422-a524-d5745603c348","Type":"ContainerStarted","Data":"06e7a381c39bd32ec8ffca7506e9533e5505c67042a220f0be7efc337b1bf372"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.389667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f47885746-l8msw" 
event={"ID":"4c29f46a-251d-4422-a524-d5745603c348","Type":"ContainerStarted","Data":"c9fbd5261d0880451bc44b488de43aca57a590481a2c656a3596694b2bad73a3"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.390318 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.390348 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.393211 4699 generic.go:334] "Generic (PLEG): container finished" podID="a2442edb-5370-4fd9-af87-6cb17498cee6" containerID="96132b23ec10daaf82debb2041fe6a9acbd17fb95fd2543bf6497612064673cd" exitCode=0 Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.393269 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgx2c" event={"ID":"a2442edb-5370-4fd9-af87-6cb17498cee6","Type":"ContainerDied","Data":"96132b23ec10daaf82debb2041fe6a9acbd17fb95fd2543bf6497612064673cd"} Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.406621 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78b588d944-t7d25" podStartSLOduration=2.565475986 podStartE2EDuration="6.406602503s" podCreationTimestamp="2025-11-22 04:27:57 +0000 UTC" firstStartedPulling="2025-11-22 04:27:58.646523986 +0000 UTC m=+1229.989145173" lastFinishedPulling="2025-11-22 04:28:02.487650503 +0000 UTC m=+1233.830271690" observedRunningTime="2025-11-22 04:28:03.401281494 +0000 UTC m=+1234.743902691" watchObservedRunningTime="2025-11-22 04:28:03.406602503 +0000 UTC m=+1234.749223690" Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.438637 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-85df448b85-c7qlg" podStartSLOduration=2.426407085 podStartE2EDuration="6.438619369s" podCreationTimestamp="2025-11-22 
04:27:57 +0000 UTC" firstStartedPulling="2025-11-22 04:27:58.452025021 +0000 UTC m=+1229.794646208" lastFinishedPulling="2025-11-22 04:28:02.464237305 +0000 UTC m=+1233.806858492" observedRunningTime="2025-11-22 04:28:03.437484012 +0000 UTC m=+1234.780105209" watchObservedRunningTime="2025-11-22 04:28:03.438619369 +0000 UTC m=+1234.781240556" Nov 22 04:28:03 crc kubenswrapper[4699]: I1122 04:28:03.454534 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f47885746-l8msw" podStartSLOduration=3.454517515 podStartE2EDuration="3.454517515s" podCreationTimestamp="2025-11-22 04:28:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:03.452804003 +0000 UTC m=+1234.795425210" watchObservedRunningTime="2025-11-22 04:28:03.454517515 +0000 UTC m=+1234.797138702" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.404652 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerStarted","Data":"00ee3fc918621b777b1c7549f562a1fe9dbd3a33ed4842f07528f68a5c4fd1de"} Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.871600 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.904964 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.905118 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.905155 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzdkb\" (UniqueName: \"kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.905182 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.905202 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.905226 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts\") pod \"a2442edb-5370-4fd9-af87-6cb17498cee6\" (UID: \"a2442edb-5370-4fd9-af87-6cb17498cee6\") " Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.907780 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.912531 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts" (OuterVolumeSpecName: "scripts") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.919566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.919609 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb" (OuterVolumeSpecName: "kube-api-access-xzdkb") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "kube-api-access-xzdkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.934885 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:04 crc kubenswrapper[4699]: I1122 04:28:04.970519 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data" (OuterVolumeSpecName: "config-data") pod "a2442edb-5370-4fd9-af87-6cb17498cee6" (UID: "a2442edb-5370-4fd9-af87-6cb17498cee6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.006971 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.007017 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzdkb\" (UniqueName: \"kubernetes.io/projected/a2442edb-5370-4fd9-af87-6cb17498cee6-kube-api-access-xzdkb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.007031 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.007040 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a2442edb-5370-4fd9-af87-6cb17498cee6-etc-machine-id\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.007051 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.007060 4699 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2442edb-5370-4fd9-af87-6cb17498cee6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.417321 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgx2c" event={"ID":"a2442edb-5370-4fd9-af87-6cb17498cee6","Type":"ContainerDied","Data":"70f64905ce8944a01a329b0de4744e0daff6439506d70264c41dd17374b711da"} Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.417702 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f64905ce8944a01a329b0de4744e0daff6439506d70264c41dd17374b711da" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.417331 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgx2c" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.419924 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerStarted","Data":"50d65d3f9ebe4574d155557705664edbd18bac3c3589088e3c2f44d816f05d93"} Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.686443 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:05 crc kubenswrapper[4699]: E1122 04:28:05.686910 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" containerName="cinder-db-sync" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.686927 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" containerName="cinder-db-sync" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.687109 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" containerName="cinder-db-sync" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.688083 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.694034 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.694193 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.694481 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.694511 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-s2lsl" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728200 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzp8k\" (UniqueName: \"kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728359 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728561 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728594 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728631 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.728708 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.761953 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.832554 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzp8k\" (UniqueName: \"kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.832869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 
04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.832999 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.833100 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.833218 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.833347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.836558 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.847963 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"] Nov 22 04:28:05 crc 
kubenswrapper[4699]: I1122 04:28:05.848168 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="dnsmasq-dns" containerID="cri-o://f2069c5bc1774fa1252817c3ef57116a0e1c8c8773cfb60a8d1ed883db685676" gracePeriod=10 Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.850858 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.850963 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.851131 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.852023 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.873458 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 
04:28:05.890399 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.892094 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.896400 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzp8k\" (UniqueName: \"kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k\") pod \"cinder-scheduler-0\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934476 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934528 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934570 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934586 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934641 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdpg\" (UniqueName: \"kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.934660 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:05 crc kubenswrapper[4699]: I1122 04:28:05.997495 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.026152 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038424 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038523 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038547 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038625 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdpg\" (UniqueName: \"kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038654 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" 
Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.038765 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.039912 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.040328 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.041048 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.061815 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.065598 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.133619 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdpg\" (UniqueName: \"kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg\") pod \"dnsmasq-dns-795f4db4bc-w5624\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.175847 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.178025 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.188868 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.202596 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250637 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jwh\" (UniqueName: \"kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250712 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs\") pod \"cinder-api-0\" (UID: 
\"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250742 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250783 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250845 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250876 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.250933 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352513 
4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jwh\" (UniqueName: \"kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352579 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352608 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352632 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352791 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352825 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.352861 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.365828 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.366333 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.368014 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.377273 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.381733 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.385649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.385729 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.431958 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jwh\" (UniqueName: \"kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh\") pod \"cinder-api-0\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.515292 4699 generic.go:334] "Generic (PLEG): container finished" podID="2036e588-c3af-438b-9d44-6f77609be731" containerID="f2069c5bc1774fa1252817c3ef57116a0e1c8c8773cfb60a8d1ed883db685676" exitCode=0 Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.515630 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" 
event={"ID":"2036e588-c3af-438b-9d44-6f77609be731","Type":"ContainerDied","Data":"f2069c5bc1774fa1252817c3ef57116a0e1c8c8773cfb60a8d1ed883db685676"} Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.532850 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.672906 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.786325 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:06 crc kubenswrapper[4699]: W1122 04:28:06.789741 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5396825b_8417_449a_90cb_c0755b9d83a4.slice/crio-b8915689f18ac708e2680563baabb7327fd4c432435e44e7543b317d5b8e20d8 WatchSource:0}: Error finding container b8915689f18ac708e2680563baabb7327fd4c432435e44e7543b317d5b8e20d8: Status 404 returned error can't find the container with id b8915689f18ac708e2680563baabb7327fd4c432435e44e7543b317d5b8e20d8 Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.861004 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.861419 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.862675 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.862832 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t745z\" (UniqueName: \"kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.862874 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.862922 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config\") pod \"2036e588-c3af-438b-9d44-6f77609be731\" (UID: \"2036e588-c3af-438b-9d44-6f77609be731\") " Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.876413 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z" (OuterVolumeSpecName: "kube-api-access-t745z") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "kube-api-access-t745z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.930331 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.956222 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.958045 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config" (OuterVolumeSpecName: "config") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.963071 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.965651 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.965679 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t745z\" (UniqueName: \"kubernetes.io/projected/2036e588-c3af-438b-9d44-6f77609be731-kube-api-access-t745z\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.965691 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.965700 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:06 crc kubenswrapper[4699]: I1122 04:28:06.965710 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.007765 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2036e588-c3af-438b-9d44-6f77609be731" (UID: "2036e588-c3af-438b-9d44-6f77609be731"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.067618 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2036e588-c3af-438b-9d44-6f77609be731-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.084529 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:07 crc kubenswrapper[4699]: W1122 04:28:07.085930 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd2db7ab_0fd2_4a42_9096_518d476e670f.slice/crio-8eee31fcb27a82b6921428478a3181bf1ab50af18f7fa7cb87ee42e08d359db5 WatchSource:0}: Error finding container 8eee31fcb27a82b6921428478a3181bf1ab50af18f7fa7cb87ee42e08d359db5: Status 404 returned error can't find the container with id 8eee31fcb27a82b6921428478a3181bf1ab50af18f7fa7cb87ee42e08d359db5 Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.197833 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.531755 4699 generic.go:334] "Generic (PLEG): container finished" podID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerID="bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333" exitCode=0 Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.532176 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" event={"ID":"bd2db7ab-0fd2-4a42-9096-518d476e670f","Type":"ContainerDied","Data":"bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.532203 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" 
event={"ID":"bd2db7ab-0fd2-4a42-9096-518d476e670f","Type":"ContainerStarted","Data":"8eee31fcb27a82b6921428478a3181bf1ab50af18f7fa7cb87ee42e08d359db5"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.537409 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerStarted","Data":"db342c311c0039efadf8c7766dc91c793c2268e558dc0b531bc482458151bfda"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.537557 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.541529 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerStarted","Data":"b8915689f18ac708e2680563baabb7327fd4c432435e44e7543b317d5b8e20d8"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.544701 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" event={"ID":"2036e588-c3af-438b-9d44-6f77609be731","Type":"ContainerDied","Data":"05069332dfa78bfe28378f5fa711fb68043772b38d0d366ce32eb2851ce353dd"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.544765 4699 scope.go:117] "RemoveContainer" containerID="f2069c5bc1774fa1252817c3ef57116a0e1c8c8773cfb60a8d1ed883db685676" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.544940 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-f9dfc" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.562168 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerStarted","Data":"003291c325a0fa20571c766e237b2759e35db90215ce62c2dfff1cf704459c27"} Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.582146 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"] Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.593304 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-f9dfc"] Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.616747 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.801471622 podStartE2EDuration="8.616730727s" podCreationTimestamp="2025-11-22 04:27:59 +0000 UTC" firstStartedPulling="2025-11-22 04:28:00.893110185 +0000 UTC m=+1232.235731372" lastFinishedPulling="2025-11-22 04:28:05.70836929 +0000 UTC m=+1237.050990477" observedRunningTime="2025-11-22 04:28:07.615900567 +0000 UTC m=+1238.958521754" watchObservedRunningTime="2025-11-22 04:28:07.616730727 +0000 UTC m=+1238.959351914" Nov 22 04:28:07 crc kubenswrapper[4699]: I1122 04:28:07.617733 4699 scope.go:117] "RemoveContainer" containerID="5c0440816663cc10dbc565ece19ea2612dff9a0d4858a3efbbe40010bc032de9" Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.428425 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:08 crc kubenswrapper[4699]: E1122 04:28:08.572094 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache]" Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.669207 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" event={"ID":"bd2db7ab-0fd2-4a42-9096-518d476e670f","Type":"ContainerStarted","Data":"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303"} Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.669575 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.675714 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerStarted","Data":"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe"} Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.681333 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerStarted","Data":"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945"} Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.726534 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:28:08 crc kubenswrapper[4699]: I1122 04:28:08.726595 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.458924 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2036e588-c3af-438b-9d44-6f77609be731" path="/var/lib/kubelet/pods/2036e588-c3af-438b-9d44-6f77609be731/volumes" Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.474804 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" podStartSLOduration=4.4747827860000005 podStartE2EDuration="4.474782786s" podCreationTimestamp="2025-11-22 04:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:08.696839385 +0000 UTC m=+1240.039460582" watchObservedRunningTime="2025-11-22 04:28:09.474782786 +0000 UTC m=+1240.817403973" Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.689748 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerStarted","Data":"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e"} Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.689898 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api-log" containerID="cri-o://7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" gracePeriod=30 Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.690133 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.690345 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api" 
containerID="cri-o://3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" gracePeriod=30 Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.696848 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerStarted","Data":"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32"} Nov 22 04:28:09 crc kubenswrapper[4699]: I1122 04:28:09.716363 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.716348892 podStartE2EDuration="3.716348892s" podCreationTimestamp="2025-11-22 04:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:09.710984111 +0000 UTC m=+1241.053605298" watchObservedRunningTime="2025-11-22 04:28:09.716348892 +0000 UTC m=+1241.058970079" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.338119 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.364187 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.5446876 podStartE2EDuration="5.364166078s" podCreationTimestamp="2025-11-22 04:28:05 +0000 UTC" firstStartedPulling="2025-11-22 04:28:06.793057407 +0000 UTC m=+1238.135678584" lastFinishedPulling="2025-11-22 04:28:07.612535875 +0000 UTC m=+1238.955157062" observedRunningTime="2025-11-22 04:28:09.742913636 +0000 UTC m=+1241.085534813" watchObservedRunningTime="2025-11-22 04:28:10.364166078 +0000 UTC m=+1241.706787265" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476630 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476723 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476759 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476842 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476871 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.476897 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4jwh\" (UniqueName: \"kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.477048 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.477096 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs\") pod \"5d0dbb45-c143-4635-b75a-52088142e020\" (UID: \"5d0dbb45-c143-4635-b75a-52088142e020\") " Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.477518 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5d0dbb45-c143-4635-b75a-52088142e020-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.477843 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs" (OuterVolumeSpecName: "logs") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.483726 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.486314 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh" (OuterVolumeSpecName: "kube-api-access-b4jwh") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "kube-api-access-b4jwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.499279 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts" (OuterVolumeSpecName: "scripts") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.527684 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.567749 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.579657 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0dbb45-c143-4635-b75a-52088142e020-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.581135 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.581233 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.581297 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.581353 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4jwh\" (UniqueName: \"kubernetes.io/projected/5d0dbb45-c143-4635-b75a-52088142e020-kube-api-access-b4jwh\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.589953 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data" (OuterVolumeSpecName: "config-data") pod "5d0dbb45-c143-4635-b75a-52088142e020" (UID: "5d0dbb45-c143-4635-b75a-52088142e020"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.682747 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0dbb45-c143-4635-b75a-52088142e020-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710539 4699 generic.go:334] "Generic (PLEG): container finished" podID="5d0dbb45-c143-4635-b75a-52088142e020" containerID="3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" exitCode=0 Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710595 4699 generic.go:334] "Generic (PLEG): container finished" podID="5d0dbb45-c143-4635-b75a-52088142e020" containerID="7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" exitCode=143 Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710625 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710680 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerDied","Data":"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e"} Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710708 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerDied","Data":"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945"} Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710718 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5d0dbb45-c143-4635-b75a-52088142e020","Type":"ContainerDied","Data":"003291c325a0fa20571c766e237b2759e35db90215ce62c2dfff1cf704459c27"} Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.710734 4699 scope.go:117] "RemoveContainer" containerID="3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.748799 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.752827 4699 scope.go:117] "RemoveContainer" containerID="7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.758821 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778042 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.778508 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="init" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778530 
4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="init" Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.778547 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778554 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api" Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.778564 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api-log" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778570 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api-log" Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.778599 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="dnsmasq-dns" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778604 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="dnsmasq-dns" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778787 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778806 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2036e588-c3af-438b-9d44-6f77609be731" containerName="dnsmasq-dns" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.778829 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0dbb45-c143-4635-b75a-52088142e020" containerName="cinder-api-log" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.779933 4699 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.785826 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.786377 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.802588 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.803021 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.803633 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.831651 4699 scope.go:117] "RemoveContainer" containerID="3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.833826 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e\": container with ID starting with 3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e not found: ID does not exist" containerID="3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.833864 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e"} err="failed to get container status \"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e\": rpc error: code = NotFound desc = could not find container 
\"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e\": container with ID starting with 3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e not found: ID does not exist" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.833886 4699 scope.go:117] "RemoveContainer" containerID="7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" Nov 22 04:28:10 crc kubenswrapper[4699]: E1122 04:28:10.834639 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945\": container with ID starting with 7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945 not found: ID does not exist" containerID="7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.834685 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945"} err="failed to get container status \"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945\": rpc error: code = NotFound desc = could not find container \"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945\": container with ID starting with 7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945 not found: ID does not exist" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.834715 4699 scope.go:117] "RemoveContainer" containerID="3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.835034 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e"} err="failed to get container status \"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e\": rpc error: code = NotFound desc = could not find 
container \"3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e\": container with ID starting with 3c4d74fea31f812028723babfa084b2d00d36c483fa1ca175e12733c9bd6e96e not found: ID does not exist" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.835053 4699 scope.go:117] "RemoveContainer" containerID="7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.839606 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945"} err="failed to get container status \"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945\": rpc error: code = NotFound desc = could not find container \"7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945\": container with ID starting with 7d3feada13c953a447682e0a49321d848996b6e8f5893960c0c2e2685edfc945 not found: ID does not exist" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.886991 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887028 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-scripts\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887053 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgt4r\" (UniqueName: 
\"kubernetes.io/projected/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-kube-api-access-lgt4r\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887110 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887141 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887166 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887307 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887348 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-logs\") pod \"cinder-api-0\" (UID: 
\"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.887370 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.988794 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.988869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-logs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.988893 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.988971 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.988991 4699 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-scripts\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.989011 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgt4r\" (UniqueName: \"kubernetes.io/projected/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-kube-api-access-lgt4r\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.989058 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.989092 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.989121 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.989662 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " 
pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.990167 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-logs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.992991 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:10 crc kubenswrapper[4699]: I1122 04:28:10.998123 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.001080 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-scripts\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.006282 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data-custom\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.011099 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.011991 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-config-data\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.018055 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgt4r\" (UniqueName: \"kubernetes.io/projected/a79d9f7b-c6a8-44bc-a2c7-65467492cff2-kube-api-access-lgt4r\") pod \"cinder-api-0\" (UID: \"a79d9f7b-c6a8-44bc-a2c7-65467492cff2\") " pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.031401 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.199750 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.494357 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0dbb45-c143-4635-b75a-52088142e020" path="/var/lib/kubelet/pods/5d0dbb45-c143-4635-b75a-52088142e020/volumes" Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.721472 4699 generic.go:334] "Generic (PLEG): container finished" podID="5c7883b3-956a-412b-87b7-f7366042440b" containerID="f89adf7b40fa021a0f196d11b77ffea1655dd93826c756566309dfba9ac95472" exitCode=0 Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.721564 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dhclj" event={"ID":"5c7883b3-956a-412b-87b7-f7366042440b","Type":"ContainerDied","Data":"f89adf7b40fa021a0f196d11b77ffea1655dd93826c756566309dfba9ac95472"} Nov 22 04:28:11 crc kubenswrapper[4699]: I1122 04:28:11.785760 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 04:28:11 crc kubenswrapper[4699]: W1122 04:28:11.795627 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79d9f7b_c6a8_44bc_a2c7_65467492cff2.slice/crio-e5f645cefd6a159938fbebf299bbbd2b533eb9e43f5d4aa7e6e5acee500c7526 WatchSource:0}: Error finding container e5f645cefd6a159938fbebf299bbbd2b533eb9e43f5d4aa7e6e5acee500c7526: Status 404 returned error can't find the container with id e5f645cefd6a159938fbebf299bbbd2b533eb9e43f5d4aa7e6e5acee500c7526 Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.608291 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.674506 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f47885746-l8msw" Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.736568 4699 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76d968d474-5l6z2"] Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.736782 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76d968d474-5l6z2" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api-log" containerID="cri-o://8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced" gracePeriod=30 Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.737158 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76d968d474-5l6z2" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api" containerID="cri-o://eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a" gracePeriod=30 Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.800455 4699 generic.go:334] "Generic (PLEG): container finished" podID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerID="d52ea8982be66097720e91090ec7a19a59fb12cb7bf19c4c25f043ea53daf19c" exitCode=0 Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.800547 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-bd6j2" event={"ID":"19251598-5cdb-4e4f-9eb7-05cd21d988fb","Type":"ContainerDied","Data":"d52ea8982be66097720e91090ec7a19a59fb12cb7bf19c4c25f043ea53daf19c"} Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.824423 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a79d9f7b-c6a8-44bc-a2c7-65467492cff2","Type":"ContainerStarted","Data":"5d30baab08fb3c21b8d5525f6353e8f3e2ad5dcbcee6bdb68b464f0b07f57f90"} Nov 22 04:28:12 crc kubenswrapper[4699]: I1122 04:28:12.824487 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a79d9f7b-c6a8-44bc-a2c7-65467492cff2","Type":"ContainerStarted","Data":"e5f645cefd6a159938fbebf299bbbd2b533eb9e43f5d4aa7e6e5acee500c7526"} Nov 22 04:28:13 
crc kubenswrapper[4699]: I1122 04:28:13.229039 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dhclj" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.330870 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config\") pod \"5c7883b3-956a-412b-87b7-f7366042440b\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.331054 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle\") pod \"5c7883b3-956a-412b-87b7-f7366042440b\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.331128 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjbn\" (UniqueName: \"kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn\") pod \"5c7883b3-956a-412b-87b7-f7366042440b\" (UID: \"5c7883b3-956a-412b-87b7-f7366042440b\") " Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.342682 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn" (OuterVolumeSpecName: "kube-api-access-fjjbn") pod "5c7883b3-956a-412b-87b7-f7366042440b" (UID: "5c7883b3-956a-412b-87b7-f7366042440b"). InnerVolumeSpecName "kube-api-access-fjjbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.367559 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config" (OuterVolumeSpecName: "config") pod "5c7883b3-956a-412b-87b7-f7366042440b" (UID: "5c7883b3-956a-412b-87b7-f7366042440b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.367616 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c7883b3-956a-412b-87b7-f7366042440b" (UID: "5c7883b3-956a-412b-87b7-f7366042440b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.433081 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.433122 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjbn\" (UniqueName: \"kubernetes.io/projected/5c7883b3-956a-412b-87b7-f7366042440b-kube-api-access-fjjbn\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.433135 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c7883b3-956a-412b-87b7-f7366042440b-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.834750 4699 generic.go:334] "Generic (PLEG): container finished" podID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerID="8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced" exitCode=143 Nov 22 04:28:13 crc 
kubenswrapper[4699]: I1122 04:28:13.834827 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerDied","Data":"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced"} Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.837074 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a79d9f7b-c6a8-44bc-a2c7-65467492cff2","Type":"ContainerStarted","Data":"082fb8bfa4eb334a8ab57453b8d26bf5c16cb40004ade7f24eae1c8aac8cadb4"} Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.837314 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.838897 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dhclj" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.839605 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dhclj" event={"ID":"5c7883b3-956a-412b-87b7-f7366042440b","Type":"ContainerDied","Data":"c4cbbbf8bef678e23067891780026b13f45888a86a8cfab0aa6b9b8c98866377"} Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.839634 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4cbbbf8bef678e23067891780026b13f45888a86a8cfab0aa6b9b8c98866377" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.899241 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.898535768 podStartE2EDuration="3.898535768s" podCreationTimestamp="2025-11-22 04:28:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:13.883866212 +0000 UTC m=+1245.226487409" watchObservedRunningTime="2025-11-22 04:28:13.898535768 +0000 UTC 
m=+1245.241156955" Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.942971 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.943194 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="dnsmasq-dns" containerID="cri-o://2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303" gracePeriod=10 Nov 22 04:28:13 crc kubenswrapper[4699]: I1122 04:28:13.950646 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.008503 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:28:14 crc kubenswrapper[4699]: E1122 04:28:14.009081 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7883b3-956a-412b-87b7-f7366042440b" containerName="neutron-db-sync" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.009099 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7883b3-956a-412b-87b7-f7366042440b" containerName="neutron-db-sync" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.009326 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7883b3-956a-412b-87b7-f7366042440b" containerName="neutron-db-sync" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.010580 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.033075 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.045945 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.046214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.046305 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.046415 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.048719 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.048967 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22bq2\" (UniqueName: \"kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.150616 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.150961 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.151027 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.151045 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.151109 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22bq2\" (UniqueName: \"kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.151158 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.151788 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.152081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.152218 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.152281 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.152632 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.172475 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22bq2\" (UniqueName: \"kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2\") pod \"dnsmasq-dns-5c9776ccc5-84dp4\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.279005 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.280410 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.282891 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.283224 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.283352 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-98xfd" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.286050 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.305835 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.356170 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.356349 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.356377 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: 
\"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.356400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7q6g\" (UniqueName: \"kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.356546 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.365193 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.458555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.458692 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.458784 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.458807 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.458844 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7q6g\" (UniqueName: \"kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc 
kubenswrapper[4699]: I1122 04:28:14.470492 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.477256 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.481217 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.482944 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.503134 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7q6g\" (UniqueName: \"kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g\") pod \"neutron-6f6d546c9b-wrks9\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.568859 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.600935 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665043 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdpg\" (UniqueName: \"kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665134 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665159 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665185 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665213 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: 
\"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665252 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665272 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665292 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665394 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665422 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc\") pod \"bd2db7ab-0fd2-4a42-9096-518d476e670f\" (UID: \"bd2db7ab-0fd2-4a42-9096-518d476e670f\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665456 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g72v9\" (UniqueName: 
\"kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.665485 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.692184 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.706535 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.715744 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg" (OuterVolumeSpecName: "kube-api-access-tmdpg") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "kube-api-access-tmdpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.746929 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). 
InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.759679 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9" (OuterVolumeSpecName: "kube-api-access-g72v9") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "kube-api-access-g72v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.774605 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts" (OuterVolumeSpecName: "scripts") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.775633 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") pod \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\" (UID: \"19251598-5cdb-4e4f-9eb7-05cd21d988fb\") " Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.776512 4699 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/19251598-5cdb-4e4f-9eb7-05cd21d988fb-etc-podinfo\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.776527 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g72v9\" (UniqueName: \"kubernetes.io/projected/19251598-5cdb-4e4f-9eb7-05cd21d988fb-kube-api-access-g72v9\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.776538 4699 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tmdpg\" (UniqueName: \"kubernetes.io/projected/bd2db7ab-0fd2-4a42-9096-518d476e670f-kube-api-access-tmdpg\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.776549 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data-merged\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: W1122 04:28:14.776705 4699 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/19251598-5cdb-4e4f-9eb7-05cd21d988fb/volumes/kubernetes.io~secret/scripts Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.776716 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts" (OuterVolumeSpecName: "scripts") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.877858 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.890839 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-bd6j2" event={"ID":"19251598-5cdb-4e4f-9eb7-05cd21d988fb","Type":"ContainerDied","Data":"bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179"} Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.890889 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd6551f6180a26a2eee5977aa4ad4db45b42b1567ba302716c93ba3944bb0179" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.890980 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-bd6j2" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.923380 4699 generic.go:334] "Generic (PLEG): container finished" podID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerID="2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303" exitCode=0 Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.923624 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.923644 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" event={"ID":"bd2db7ab-0fd2-4a42-9096-518d476e670f","Type":"ContainerDied","Data":"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303"} Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.924799 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795f4db4bc-w5624" event={"ID":"bd2db7ab-0fd2-4a42-9096-518d476e670f","Type":"ContainerDied","Data":"8eee31fcb27a82b6921428478a3181bf1ab50af18f7fa7cb87ee42e08d359db5"} Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.924817 4699 scope.go:117] "RemoveContainer" containerID="2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.932306 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.951395 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data" (OuterVolumeSpecName: "config-data") pod "19251598-5cdb-4e4f-9eb7-05cd21d988fb" (UID: "19251598-5cdb-4e4f-9eb7-05cd21d988fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.965499 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.974027 4699 scope.go:117] "RemoveContainer" containerID="bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.980710 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.980736 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:14 crc kubenswrapper[4699]: I1122 04:28:14.980745 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19251598-5cdb-4e4f-9eb7-05cd21d988fb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.003335 4699 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.011144 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.026653 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config" (OuterVolumeSpecName: "config") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.034277 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.043496 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd2db7ab-0fd2-4a42-9096-518d476e670f" (UID: "bd2db7ab-0fd2-4a42-9096-518d476e670f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.085113 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.085141 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.085150 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.085158 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd2db7ab-0fd2-4a42-9096-518d476e670f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.165416 4699 scope.go:117] "RemoveContainer" containerID="2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.171866 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-65957c9c4f-4rj2b"] Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.172247 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="dnsmasq-dns" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172258 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="dnsmasq-dns" Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.172281 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerName="init" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172288 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerName="init" Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.172305 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerName="ironic-db-sync" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172312 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerName="ironic-db-sync" Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.172326 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="init" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172334 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="init" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172834 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="19251598-5cdb-4e4f-9eb7-05cd21d988fb" containerName="ironic-db-sync" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.172857 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" containerName="dnsmasq-dns" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.173571 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.174952 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303\": container with ID starting with 2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303 not found: ID does not exist" containerID="2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.174995 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303"} err="failed to get container status \"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303\": rpc error: code = NotFound desc = could not find container \"2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303\": container with ID starting with 2b0f51586de1eb03dd61de946ae85df2280cb312373efdfe5dbccf61cccc4303 not found: ID does not exist" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.175023 4699 scope.go:117] "RemoveContainer" containerID="bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.181470 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.188487 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-c94bq"] Nov 22 04:28:15 crc kubenswrapper[4699]: E1122 04:28:15.189605 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333\": container with ID starting with 
bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333 not found: ID does not exist" containerID="bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.189666 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333"} err="failed to get container status \"bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333\": rpc error: code = NotFound desc = could not find container \"bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333\": container with ID starting with bd4406f4634f9ed12a3623a5fad6e408653739b1a14d85673451ec2777bcb333 not found: ID does not exist" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.189830 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.226702 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-65957c9c4f-4rj2b"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.246301 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-c94bq"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.290812 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j58p2\" (UniqueName: \"kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.290864 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-config\") pod 
\"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.290913 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.291004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcltl\" (UniqueName: \"kubernetes.io/projected/474af2c7-c72f-4420-94a9-4876e0dbd68e-kube-api-access-xcltl\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.291101 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-combined-ca-bundle\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.298501 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-b1b0-account-create-zfw5l"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.299788 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.301645 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-b1b0-account-create-zfw5l"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.302197 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.376750 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.380956 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.387008 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.387255 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.387288 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.387384 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393570 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393650 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-combined-ca-bundle\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393689 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393717 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cfcj\" (UniqueName: \"kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393746 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgt2\" (UniqueName: \"kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393768 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393792 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j58p2\" (UniqueName: \"kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-config\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393923 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393956 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.393979 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.395699 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcltl\" (UniqueName: \"kubernetes.io/projected/474af2c7-c72f-4420-94a9-4876e0dbd68e-kube-api-access-xcltl\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.395801 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.399010 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc 
kubenswrapper[4699]: I1122 04:28:15.406992 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.418881 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcltl\" (UniqueName: \"kubernetes.io/projected/474af2c7-c72f-4420-94a9-4876e0dbd68e-kube-api-access-xcltl\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.426866 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-config\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.439375 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474af2c7-c72f-4420-94a9-4876e0dbd68e-combined-ca-bundle\") pod \"ironic-neutron-agent-65957c9c4f-4rj2b\" (UID: \"474af2c7-c72f-4420-94a9-4876e0dbd68e\") " pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.451056 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j58p2\" (UniqueName: \"kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2\") pod \"ironic-inspector-db-create-c94bq\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.492667 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497301 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497378 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497496 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497527 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cfcj\" (UniqueName: \"kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497555 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scgt2\" (UniqueName: \"kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497582 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497607 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497667 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497720 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.497757 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.499811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.503028 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.503540 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.507936 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.511033 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.511394 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo\") pod \"ironic-794784977b-czv6j\" (UID: 
\"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.515248 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.523110 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.523648 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.530784 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cfcj\" (UniqueName: \"kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj\") pod \"ironic-inspector-b1b0-account-create-zfw5l\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.531103 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795f4db4bc-w5624"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.549159 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgt2\" (UniqueName: \"kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2\") pod \"ironic-794784977b-czv6j\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc 
kubenswrapper[4699]: I1122 04:28:15.556611 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.579457 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.595475 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.651049 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.956328 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerStarted","Data":"983884d9b2b360e2a9acf79e9de2759b9dad21849d254cb14e57da4bf5e190d0"} Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.981617 4699 generic.go:334] "Generic (PLEG): container finished" podID="96a27084-0580-4b64-9cde-906db9a6f231" containerID="05f6d2b47545b51eb320788f3230ef396c1f699ec82849b48f100eb8ac03cb5f" exitCode=0 Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.981677 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" event={"ID":"96a27084-0580-4b64-9cde-906db9a6f231","Type":"ContainerDied","Data":"05f6d2b47545b51eb320788f3230ef396c1f699ec82849b48f100eb8ac03cb5f"} Nov 22 04:28:15 crc kubenswrapper[4699]: I1122 04:28:15.981712 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" event={"ID":"96a27084-0580-4b64-9cde-906db9a6f231","Type":"ContainerStarted","Data":"17bb57c2e772c5534959caa05624ca29d250ad48563f05ebece5793bf6078cb3"} Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.008336 4699 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-76d968d474-5l6z2" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:57526->10.217.0.156:9311: read: connection reset by peer" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.008475 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76d968d474-5l6z2" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:57530->10.217.0.156:9311: read: connection reset by peer" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.186135 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-65957c9c4f-4rj2b"] Nov 22 04:28:16 crc kubenswrapper[4699]: W1122 04:28:16.328695 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad04e92a_8275_4123_8b21_384b2f56cc3b.slice/crio-51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d WatchSource:0}: Error finding container 51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d: Status 404 returned error can't find the container with id 51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.334316 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-c94bq"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.345635 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:16 crc kubenswrapper[4699]: W1122 04:28:16.368821 4699 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda52be58_8760_4d0a_866a_9eb3b47b2e8b.slice/crio-a32075ab60ca7e8d40f944859afbe3afb51f7afae7ff399b48dd5462e1558611 WatchSource:0}: Error finding container a32075ab60ca7e8d40f944859afbe3afb51f7afae7ff399b48dd5462e1558611: Status 404 returned error can't find the container with id a32075ab60ca7e8d40f944859afbe3afb51f7afae7ff399b48dd5462e1558611 Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.386122 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.421127 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.424364 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.431206 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.431291 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.461215 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.497367 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.539493 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-b1b0-account-create-zfw5l"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.551581 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.551636 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.551672 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.551688 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5nj9\" (UniqueName: \"kubernetes.io/projected/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-kube-api-access-d5nj9\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.551710 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.552010 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-scripts\") pod 
\"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.552116 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.552160 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.655368 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-scripts\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656460 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" 
Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656562 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656588 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656653 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5nj9\" (UniqueName: \"kubernetes.io/projected/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-kube-api-access-d5nj9\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.656675 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.660618 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.661699 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.662913 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-scripts\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.666184 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.669140 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.672627 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " 
pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.672908 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-config-data\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.676702 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.681365 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5nj9\" (UniqueName: \"kubernetes.io/projected/6b0a42c8-e8a1-45b3-9f29-77459d98ea4d-kube-api-access-d5nj9\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.737275 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ironic-conductor-0\" (UID: \"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d\") " pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.818699 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-conductor-0" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.859089 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom\") pod \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.859211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle\") pod \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.859320 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs\") pod \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.859360 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hd57\" (UniqueName: \"kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57\") pod \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.859478 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data\") pod \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\" (UID: \"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe\") " Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.860595 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs" (OuterVolumeSpecName: "logs") pod "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" (UID: "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.866758 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" (UID: "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.896650 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57" (OuterVolumeSpecName: "kube-api-access-5hd57") pod "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" (UID: "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe"). InnerVolumeSpecName "kube-api-access-5hd57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.906857 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-566cbdbc45-ld9jb"] Nov 22 04:28:16 crc kubenswrapper[4699]: E1122 04:28:16.907330 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api-log" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.907350 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api-log" Nov 22 04:28:16 crc kubenswrapper[4699]: E1122 04:28:16.907376 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.907385 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.907645 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api-log" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.907681 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerName="barbican-api" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.908910 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.909944 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" (UID: "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.914328 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.915422 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.934796 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-566cbdbc45-ld9jb"] Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.964757 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.964792 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.964801 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.964810 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hd57\" (UniqueName: \"kubernetes.io/projected/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-kube-api-access-5hd57\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.995534 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerStarted","Data":"ac4edccebadcfc5938284e5449b06ff7ea589a49415176c7223a9ee42e3b39e2"} Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 
04:28:16.996464 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-c94bq" event={"ID":"ad04e92a-8275-4123-8b21-384b2f56cc3b","Type":"ContainerStarted","Data":"d3d3fe9c4e2a1ce9c52ff6646ef81226fd80658671f070e8c6123e626e9a0221"} Nov 22 04:28:16 crc kubenswrapper[4699]: I1122 04:28:16.996482 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-c94bq" event={"ID":"ad04e92a-8275-4123-8b21-384b2f56cc3b","Type":"ContainerStarted","Data":"51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:16.998336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerStarted","Data":"a32075ab60ca7e8d40f944859afbe3afb51f7afae7ff399b48dd5462e1558611"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:16.999193 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" event={"ID":"0e1b87a4-8a2e-4d69-b940-39df820a2a61","Type":"ContainerStarted","Data":"b0a2c92d4114c57687fddfd1b595882f3246df62ba032fead209de790d82718d"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:16.999211 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" event={"ID":"0e1b87a4-8a2e-4d69-b940-39df820a2a61","Type":"ContainerStarted","Data":"1dd5decc54e1600a44a30c94303bbce13dc1d3905e24724e834329270d8d525f"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.012566 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data" (OuterVolumeSpecName: "config-data") pod "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" (UID: "eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.049136 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-create-c94bq" podStartSLOduration=2.049120613 podStartE2EDuration="2.049120613s" podCreationTimestamp="2025-11-22 04:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:17.036194909 +0000 UTC m=+1248.378816096" watchObservedRunningTime="2025-11-22 04:28:17.049120613 +0000 UTC m=+1248.391741800" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.058007 4699 generic.go:334] "Generic (PLEG): container finished" podID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" containerID="eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a" exitCode=0 Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.058124 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerDied","Data":"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.058154 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76d968d474-5l6z2" event={"ID":"eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe","Type":"ContainerDied","Data":"e53b82e4036a59e3fdc23e4c806462741d6eb6494984e66b259c865fdaa7a7d8"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.058172 4699 scope.go:117] "RemoveContainer" containerID="eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.058341 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76d968d474-5l6z2" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.069184 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" podStartSLOduration=2.069156939 podStartE2EDuration="2.069156939s" podCreationTimestamp="2025-11-22 04:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:17.068487502 +0000 UTC m=+1248.411108699" watchObservedRunningTime="2025-11-22 04:28:17.069156939 +0000 UTC m=+1248.411778126" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075056 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-internal-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075226 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-httpd-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075367 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-combined-ca-bundle\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075402 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-public-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075709 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpk9l\" (UniqueName: \"kubernetes.io/projected/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-kube-api-access-xpk9l\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075850 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-ovndb-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.075955 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.076824 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" 
event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerStarted","Data":"93774c3e979cfc6002da894382cdd76ae9a2684d25c31d0ab44f93df7f619464"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.076855 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerStarted","Data":"7e1fcbb58973272b444fef351419c088db0c22f36b8210be107197b7f9bc8eaa"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.077358 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.079875 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" event={"ID":"96a27084-0580-4b64-9cde-906db9a6f231","Type":"ContainerStarted","Data":"6b465583892dd38329a475f91a9adefc3fae48c4f64ddf86e6c7c142ed04623b"} Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.079983 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="cinder-scheduler" containerID="cri-o://c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe" gracePeriod=30 Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.080067 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="probe" containerID="cri-o://eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32" gracePeriod=30 Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.081601 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.101596 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-6f6d546c9b-wrks9" podStartSLOduration=3.101578705 podStartE2EDuration="3.101578705s" podCreationTimestamp="2025-11-22 04:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:17.090386933 +0000 UTC m=+1248.433008110" watchObservedRunningTime="2025-11-22 04:28:17.101578705 +0000 UTC m=+1248.444199892" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.125600 4699 scope.go:117] "RemoveContainer" containerID="8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.180696 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-combined-ca-bundle\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.180750 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-public-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.180940 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpk9l\" (UniqueName: \"kubernetes.io/projected/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-kube-api-access-xpk9l\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.181032 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-ovndb-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.181092 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-internal-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.181209 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-httpd-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.181254 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.189713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-public-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.190637 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: 
\"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.192351 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-internal-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.194143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-combined-ca-bundle\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.194649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-httpd-config\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.203264 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-ovndb-tls-certs\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.208357 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpk9l\" (UniqueName: \"kubernetes.io/projected/4c5bbb47-8099-4bbb-b8a0-d2a56265522b-kube-api-access-xpk9l\") pod \"neutron-566cbdbc45-ld9jb\" (UID: \"4c5bbb47-8099-4bbb-b8a0-d2a56265522b\") " pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc 
kubenswrapper[4699]: I1122 04:28:17.273368 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.467375 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2db7ab-0fd2-4a42-9096-518d476e670f" path="/var/lib/kubelet/pods/bd2db7ab-0fd2-4a42-9096-518d476e670f/volumes" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.507378 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" podStartSLOduration=4.507356073 podStartE2EDuration="4.507356073s" podCreationTimestamp="2025-11-22 04:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:17.120815381 +0000 UTC m=+1248.463436578" watchObservedRunningTime="2025-11-22 04:28:17.507356073 +0000 UTC m=+1248.849977260" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.516342 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.561910 4699 scope.go:117] "RemoveContainer" containerID="eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a" Nov 22 04:28:17 crc kubenswrapper[4699]: E1122 04:28:17.564641 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a\": container with ID starting with eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a not found: ID does not exist" containerID="eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.564695 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a"} err="failed to get container status \"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a\": rpc error: code = NotFound desc = could not find container \"eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a\": container with ID starting with eefbea6d5fb036522c9edff0ec95f12837269e0a104e6d5bb2dec3833687276a not found: ID does not exist" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.564732 4699 scope.go:117] "RemoveContainer" containerID="8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.564837 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76d968d474-5l6z2"] Nov 22 04:28:17 crc kubenswrapper[4699]: E1122 04:28:17.567588 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced\": container with ID starting with 8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced not found: ID does not exist" containerID="8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.567632 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced"} err="failed to get container status \"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced\": rpc error: code = NotFound desc = could not find container \"8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced\": container with ID starting with 8382c65c1883c39f5a96b59663fbbf89ac9a4d17fa18641974495ad18a2a8ced not found: ID does not exist" Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.617329 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-76d968d474-5l6z2"] Nov 22 04:28:17 crc kubenswrapper[4699]: I1122 04:28:17.764914 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6bf6559788-s4hk6" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.010985 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-566cbdbc45-ld9jb"] Nov 22 04:28:18 crc kubenswrapper[4699]: W1122 04:28:18.054542 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c5bbb47_8099_4bbb_b8a0_d2a56265522b.slice/crio-513bd0e468966acdc09a2dcf3c1acc433f9ad7002da0556e46d7463a417f1e46 WatchSource:0}: Error finding container 513bd0e468966acdc09a2dcf3c1acc433f9ad7002da0556e46d7463a417f1e46: Status 404 returned error can't find the container with id 513bd0e468966acdc09a2dcf3c1acc433f9ad7002da0556e46d7463a417f1e46 Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.110105 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-566cbdbc45-ld9jb" event={"ID":"4c5bbb47-8099-4bbb-b8a0-d2a56265522b","Type":"ContainerStarted","Data":"513bd0e468966acdc09a2dcf3c1acc433f9ad7002da0556e46d7463a417f1e46"} Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.111930 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"1581830665fc403467a10cb56b6282a68f85f3c9bb99e27a554c9172b2fa45c3"} Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.118736 4699 generic.go:334] "Generic (PLEG): container finished" podID="5396825b-8417-449a-90cb-c0755b9d83a4" containerID="eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32" exitCode=0 Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.118815 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerDied","Data":"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32"} Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.122970 4699 generic.go:334] "Generic (PLEG): container finished" podID="0e1b87a4-8a2e-4d69-b940-39df820a2a61" containerID="b0a2c92d4114c57687fddfd1b595882f3246df62ba032fead209de790d82718d" exitCode=0 Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.123083 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" event={"ID":"0e1b87a4-8a2e-4d69-b940-39df820a2a61","Type":"ContainerDied","Data":"b0a2c92d4114c57687fddfd1b595882f3246df62ba032fead209de790d82718d"} Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.161507 4699 generic.go:334] "Generic (PLEG): container finished" podID="ad04e92a-8275-4123-8b21-384b2f56cc3b" containerID="d3d3fe9c4e2a1ce9c52ff6646ef81226fd80658671f070e8c6123e626e9a0221" exitCode=0 Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.161572 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-c94bq" event={"ID":"ad04e92a-8275-4123-8b21-384b2f56cc3b","Type":"ContainerDied","Data":"d3d3fe9c4e2a1ce9c52ff6646ef81226fd80658671f070e8c6123e626e9a0221"} Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.384331 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.385959 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.396823 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.397205 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.397376 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-8wc8d" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.403807 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.520502 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.520591 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw55t\" (UniqueName: \"kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.520713 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.520936 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.622926 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.623381 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.623499 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw55t\" (UniqueName: \"kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.623951 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.624373 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.624899 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: E1122 04:28:18.625461 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-xw55t openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="fb596c66-8d5f-4ed3-a053-2555488a57a8" Nov 22 04:28:18 crc kubenswrapper[4699]: E1122 04:28:18.626809 4699 projected.go:194] Error preparing data for projected volume kube-api-access-xw55t for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Nov 22 04:28:18 crc kubenswrapper[4699]: E1122 04:28:18.626873 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t podName:fb596c66-8d5f-4ed3-a053-2555488a57a8 nodeName:}" failed. No retries permitted until 2025-11-22 04:28:19.126852135 +0000 UTC m=+1250.469473392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xw55t" (UniqueName: "kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t") pod "openstackclient" (UID: "fb596c66-8d5f-4ed3-a053-2555488a57a8") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.629902 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.631865 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.633232 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.666420 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.667999 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.699728 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.829345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdc9s\" (UniqueName: \"kubernetes.io/projected/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-kube-api-access-xdc9s\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.829417 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.829559 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.829607 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: E1122 04:28:18.909871 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache]" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.931272 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.931822 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.932009 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.932381 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdc9s\" (UniqueName: \"kubernetes.io/projected/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-kube-api-access-xdc9s\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.934446 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.945039 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.947060 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:18 crc kubenswrapper[4699]: I1122 04:28:18.953088 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdc9s\" (UniqueName: \"kubernetes.io/projected/b3f3d84b-ad88-4145-9e18-b2baa8eff9c4-kube-api-access-xdc9s\") pod \"openstackclient\" (UID: \"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4\") " pod="openstack/openstackclient" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.023390 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.137058 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw55t\" (UniqueName: \"kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t\") pod \"openstackclient\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " pod="openstack/openstackclient" Nov 22 04:28:19 crc kubenswrapper[4699]: E1122 04:28:19.139472 4699 projected.go:194] Error preparing data for projected volume kube-api-access-xw55t for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fb596c66-8d5f-4ed3-a053-2555488a57a8) does not match the UID in record. The object might have been deleted and then recreated Nov 22 04:28:19 crc kubenswrapper[4699]: E1122 04:28:19.139523 4699 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t podName:fb596c66-8d5f-4ed3-a053-2555488a57a8 nodeName:}" failed. No retries permitted until 2025-11-22 04:28:20.139509224 +0000 UTC m=+1251.482130411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xw55t" (UniqueName: "kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t") pod "openstackclient" (UID: "fb596c66-8d5f-4ed3-a053-2555488a57a8") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (fb596c66-8d5f-4ed3-a053-2555488a57a8) does not match the UID in record. 
The object might have been deleted and then recreated Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.182920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-566cbdbc45-ld9jb" event={"ID":"4c5bbb47-8099-4bbb-b8a0-d2a56265522b","Type":"ContainerStarted","Data":"ca42143b0be85a769a61cfcdceea77627fec1a0f619e8b2a8301ce55128d7fdb"} Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.185642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"ccc2f25b642fefd6ab2d4c24729785b17b12aee45db7f3618f9d10c93d62ae82"} Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.186164 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.190053 4699 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fb596c66-8d5f-4ed3-a053-2555488a57a8" podUID="b3f3d84b-ad88-4145-9e18-b2baa8eff9c4" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.203951 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.339999 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config\") pod \"fb596c66-8d5f-4ed3-a053-2555488a57a8\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.340221 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret\") pod \"fb596c66-8d5f-4ed3-a053-2555488a57a8\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.340281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle\") pod \"fb596c66-8d5f-4ed3-a053-2555488a57a8\" (UID: \"fb596c66-8d5f-4ed3-a053-2555488a57a8\") " Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.341176 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fb596c66-8d5f-4ed3-a053-2555488a57a8" (UID: "fb596c66-8d5f-4ed3-a053-2555488a57a8"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.342887 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.342920 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw55t\" (UniqueName: \"kubernetes.io/projected/fb596c66-8d5f-4ed3-a053-2555488a57a8-kube-api-access-xw55t\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.358323 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fb596c66-8d5f-4ed3-a053-2555488a57a8" (UID: "fb596c66-8d5f-4ed3-a053-2555488a57a8"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.358567 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb596c66-8d5f-4ed3-a053-2555488a57a8" (UID: "fb596c66-8d5f-4ed3-a053-2555488a57a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.447572 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.447921 4699 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fb596c66-8d5f-4ed3-a053-2555488a57a8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.512555 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe" path="/var/lib/kubelet/pods/eea8b7c9-d91d-4267-a1ec-77fb7cf0a8fe/volumes" Nov 22 04:28:19 crc kubenswrapper[4699]: I1122 04:28:19.516277 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb596c66-8d5f-4ed3-a053-2555488a57a8" path="/var/lib/kubelet/pods/fb596c66-8d5f-4ed3-a053-2555488a57a8/volumes" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.198944 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b0a42c8-e8a1-45b3-9f29-77459d98ea4d" containerID="ccc2f25b642fefd6ab2d4c24729785b17b12aee45db7f3618f9d10c93d62ae82" exitCode=0 Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.199013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerDied","Data":"ccc2f25b642fefd6ab2d4c24729785b17b12aee45db7f3618f9d10c93d62ae82"} Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.210101 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" event={"ID":"0e1b87a4-8a2e-4d69-b940-39df820a2a61","Type":"ContainerDied","Data":"1dd5decc54e1600a44a30c94303bbce13dc1d3905e24724e834329270d8d525f"} Nov 22 
04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.210131 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd5decc54e1600a44a30c94303bbce13dc1d3905e24724e834329270d8d525f" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.212253 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.212242 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-c94bq" event={"ID":"ad04e92a-8275-4123-8b21-384b2f56cc3b","Type":"ContainerDied","Data":"51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d"} Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.212454 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51d7eca3f531d91f56626d5b01ff2d4f49fcd11b44e07879db8dede42936525d" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.383785 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.465150 4699 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fb596c66-8d5f-4ed3-a053-2555488a57a8" podUID="b3f3d84b-ad88-4145-9e18-b2baa8eff9c4" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.476003 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts\") pod \"ad04e92a-8275-4123-8b21-384b2f56cc3b\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.476255 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j58p2\" (UniqueName: \"kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2\") pod \"ad04e92a-8275-4123-8b21-384b2f56cc3b\" (UID: \"ad04e92a-8275-4123-8b21-384b2f56cc3b\") " Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.478131 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad04e92a-8275-4123-8b21-384b2f56cc3b" (UID: "ad04e92a-8275-4123-8b21-384b2f56cc3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.496912 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2" (OuterVolumeSpecName: "kube-api-access-j58p2") pod "ad04e92a-8275-4123-8b21-384b2f56cc3b" (UID: "ad04e92a-8275-4123-8b21-384b2f56cc3b"). InnerVolumeSpecName "kube-api-access-j58p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.578540 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j58p2\" (UniqueName: \"kubernetes.io/projected/ad04e92a-8275-4123-8b21-384b2f56cc3b-kube-api-access-j58p2\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.578579 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad04e92a-8275-4123-8b21-384b2f56cc3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.587969 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.680331 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts\") pod \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.680480 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cfcj\" (UniqueName: \"kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj\") pod \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\" (UID: \"0e1b87a4-8a2e-4d69-b940-39df820a2a61\") " Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.681029 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e1b87a4-8a2e-4d69-b940-39df820a2a61" (UID: "0e1b87a4-8a2e-4d69-b940-39df820a2a61"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.681419 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e1b87a4-8a2e-4d69-b940-39df820a2a61-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.685356 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj" (OuterVolumeSpecName: "kube-api-access-6cfcj") pod "0e1b87a4-8a2e-4d69-b940-39df820a2a61" (UID: "0e1b87a4-8a2e-4d69-b940-39df820a2a61"). InnerVolumeSpecName "kube-api-access-6cfcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.787759 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cfcj\" (UniqueName: \"kubernetes.io/projected/0e1b87a4-8a2e-4d69-b940-39df820a2a61-kube-api-access-6cfcj\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:20 crc kubenswrapper[4699]: I1122 04:28:20.881647 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.113347 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197077 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197162 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197190 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197229 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzp8k\" (UniqueName: \"kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197376 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.197407 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom\") pod \"5396825b-8417-449a-90cb-c0755b9d83a4\" (UID: \"5396825b-8417-449a-90cb-c0755b9d83a4\") " Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.199965 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.222903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts" (OuterVolumeSpecName: "scripts") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.248347 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.248787 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k" (OuterVolumeSpecName: "kube-api-access-gzp8k") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "kube-api-access-gzp8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.266597 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-554db96b96-4xcnr"] Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.267030 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="cinder-scheduler" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267043 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="cinder-scheduler" Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.267059 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="probe" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267066 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="probe" Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.267083 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1b87a4-8a2e-4d69-b940-39df820a2a61" containerName="mariadb-account-create" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267089 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1b87a4-8a2e-4d69-b940-39df820a2a61" containerName="mariadb-account-create" Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.267112 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad04e92a-8275-4123-8b21-384b2f56cc3b" containerName="mariadb-database-create" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267119 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad04e92a-8275-4123-8b21-384b2f56cc3b" containerName="mariadb-database-create" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267285 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad04e92a-8275-4123-8b21-384b2f56cc3b" 
containerName="mariadb-database-create" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267298 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="cinder-scheduler" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267309 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1b87a4-8a2e-4d69-b940-39df820a2a61" containerName="mariadb-account-create" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.267323 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" containerName="probe" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.268950 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.272712 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.272850 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.276250 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-566cbdbc45-ld9jb" event={"ID":"4c5bbb47-8099-4bbb-b8a0-d2a56265522b","Type":"ContainerStarted","Data":"35f134f88bfab3d95d999c39021517091ac44bff0284f0e9b563a11f2b0a3de4"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.276380 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.278382 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-554db96b96-4xcnr"] Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.298469 4699 generic.go:334] "Generic (PLEG): container finished" podID="5396825b-8417-449a-90cb-c0755b9d83a4" 
containerID="c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe" exitCode=0 Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.298559 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerDied","Data":"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.298596 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5396825b-8417-449a-90cb-c0755b9d83a4","Type":"ContainerDied","Data":"b8915689f18ac708e2680563baabb7327fd4c432435e44e7543b317d5b8e20d8"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.298616 4699 scope.go:117] "RemoveContainer" containerID="eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.298785 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.301812 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.301846 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.301858 4699 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5396825b-8417-449a-90cb-c0755b9d83a4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.301870 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzp8k\" 
(UniqueName: \"kubernetes.io/projected/5396825b-8417-449a-90cb-c0755b9d83a4-kube-api-access-gzp8k\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.322363 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4","Type":"ContainerStarted","Data":"0f4ad0c8819fac0375742ff46b3b3a20d6a70d845819023ce3815a89837731f5"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.338824 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.351930 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerStarted","Data":"7ed953b0d80e9df6281772ea99e9ca5db990127d1c62e1cddc841220e453adca"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.356453 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-c94bq" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.358033 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-566cbdbc45-ld9jb" podStartSLOduration=5.358012321 podStartE2EDuration="5.358012321s" podCreationTimestamp="2025-11-22 04:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:21.345345034 +0000 UTC m=+1252.687966221" watchObservedRunningTime="2025-11-22 04:28:21.358012321 +0000 UTC m=+1252.700633528" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.360254 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerStarted","Data":"a505574fc6c1c81ade845e7ecd7a3e86581538f29889bd115174e6acfe5dbf52"} Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.360294 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.360372 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-b1b0-account-create-zfw5l" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421027 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-public-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421070 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-logs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqctq\" (UniqueName: \"kubernetes.io/projected/e01db47e-4633-40f5-ad23-14867d89eba8-kube-api-access-zqctq\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421132 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-merged\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421155 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e01db47e-4633-40f5-ad23-14867d89eba8-etc-podinfo\") pod \"ironic-554db96b96-4xcnr\" (UID: 
\"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-combined-ca-bundle\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421195 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-scripts\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421215 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-custom\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421239 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-internal-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421357 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data\") pod \"ironic-554db96b96-4xcnr\" (UID: 
\"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.421473 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.450604 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data" (OuterVolumeSpecName: "config-data") pod "5396825b-8417-449a-90cb-c0755b9d83a4" (UID: "5396825b-8417-449a-90cb-c0755b9d83a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.466214 4699 scope.go:117] "RemoveContainer" containerID="c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.491461 4699 scope.go:117] "RemoveContainer" containerID="eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32" Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.492058 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32\": container with ID starting with eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32 not found: ID does not exist" containerID="eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.492109 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32"} err="failed to get container status \"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32\": rpc error: code = NotFound desc = could not find 
container \"eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32\": container with ID starting with eaf34eade050fc67216bfc310dd009497306c3245680da6401a1624a02697d32 not found: ID does not exist" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.492139 4699 scope.go:117] "RemoveContainer" containerID="c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe" Nov 22 04:28:21 crc kubenswrapper[4699]: E1122 04:28:21.492506 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe\": container with ID starting with c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe not found: ID does not exist" containerID="c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.492537 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe"} err="failed to get container status \"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe\": rpc error: code = NotFound desc = could not find container \"c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe\": container with ID starting with c6063467628fb4837a805b00a933593595ff173bc8d1851e969145acc47224fe not found: ID does not exist" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.492908 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" podStartSLOduration=2.436841683 podStartE2EDuration="6.492892521s" podCreationTimestamp="2025-11-22 04:28:15 +0000 UTC" firstStartedPulling="2025-11-22 04:28:16.203311257 +0000 UTC m=+1247.545932434" lastFinishedPulling="2025-11-22 04:28:20.259362085 +0000 UTC m=+1251.601983272" observedRunningTime="2025-11-22 04:28:21.443594006 +0000 UTC m=+1252.786215193" 
watchObservedRunningTime="2025-11-22 04:28:21.492892521 +0000 UTC m=+1252.835513718" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.523923 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-public-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524293 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-logs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524325 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqctq\" (UniqueName: \"kubernetes.io/projected/e01db47e-4633-40f5-ad23-14867d89eba8-kube-api-access-zqctq\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524376 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-merged\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524398 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e01db47e-4633-40f5-ad23-14867d89eba8-etc-podinfo\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc 
kubenswrapper[4699]: I1122 04:28:21.524747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-merged\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524415 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-combined-ca-bundle\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524799 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-scripts\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524822 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-custom\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524847 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-internal-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.524908 4699 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e01db47e-4633-40f5-ad23-14867d89eba8-logs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.525386 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.526340 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5396825b-8417-449a-90cb-c0755b9d83a4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.536412 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-combined-ca-bundle\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.536891 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-scripts\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.536903 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-internal-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc 
kubenswrapper[4699]: I1122 04:28:21.539090 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqctq\" (UniqueName: \"kubernetes.io/projected/e01db47e-4633-40f5-ad23-14867d89eba8-kube-api-access-zqctq\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.539777 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data-custom\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.544560 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e01db47e-4633-40f5-ad23-14867d89eba8-etc-podinfo\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.545174 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-public-tls-certs\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.551264 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e01db47e-4633-40f5-ad23-14867d89eba8-config-data\") pod \"ironic-554db96b96-4xcnr\" (UID: \"e01db47e-4633-40f5-ad23-14867d89eba8\") " pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.637498 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] 
Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.662513 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.701527 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.703151 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.705879 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.741307 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.757890 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.833816 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.834198 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.834237 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.834278 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9b1d7c8-7353-480a-aa6f-7031b5228838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.834355 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xds67\" (UniqueName: \"kubernetes.io/projected/a9b1d7c8-7353-480a-aa6f-7031b5228838-kube-api-access-xds67\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.834400 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936707 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xds67\" (UniqueName: \"kubernetes.io/projected/a9b1d7c8-7353-480a-aa6f-7031b5228838-kube-api-access-xds67\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936796 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936883 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936907 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936937 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.936978 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9b1d7c8-7353-480a-aa6f-7031b5228838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.937118 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9b1d7c8-7353-480a-aa6f-7031b5228838-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " 
pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.947743 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.954106 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.963372 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.963963 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xds67\" (UniqueName: \"kubernetes.io/projected/a9b1d7c8-7353-480a-aa6f-7031b5228838-kube-api-access-xds67\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:21 crc kubenswrapper[4699]: I1122 04:28:21.979446 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b1d7c8-7353-480a-aa6f-7031b5228838-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9b1d7c8-7353-480a-aa6f-7031b5228838\") " pod="openstack/cinder-scheduler-0" Nov 22 04:28:22 crc kubenswrapper[4699]: I1122 04:28:22.025628 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 04:28:22 crc kubenswrapper[4699]: I1122 04:28:22.389713 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-554db96b96-4xcnr"] Nov 22 04:28:22 crc kubenswrapper[4699]: W1122 04:28:22.404409 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode01db47e_4633_40f5_ad23_14867d89eba8.slice/crio-5f873ede072aab874528d96f11ac3df1b64ccd8bf7eeffe6a58b312ca6386c54 WatchSource:0}: Error finding container 5f873ede072aab874528d96f11ac3df1b64ccd8bf7eeffe6a58b312ca6386c54: Status 404 returned error can't find the container with id 5f873ede072aab874528d96f11ac3df1b64ccd8bf7eeffe6a58b312ca6386c54 Nov 22 04:28:22 crc kubenswrapper[4699]: I1122 04:28:22.405122 4699 generic.go:334] "Generic (PLEG): container finished" podID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerID="7ed953b0d80e9df6281772ea99e9ca5db990127d1c62e1cddc841220e453adca" exitCode=0 Nov 22 04:28:22 crc kubenswrapper[4699]: I1122 04:28:22.405223 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerDied","Data":"7ed953b0d80e9df6281772ea99e9ca5db990127d1c62e1cddc841220e453adca"} Nov 22 04:28:22 crc kubenswrapper[4699]: I1122 04:28:22.579402 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 04:28:22 crc kubenswrapper[4699]: W1122 04:28:22.619335 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b1d7c8_7353_480a_aa6f_7031b5228838.slice/crio-4a3bd01057e68ea21117eb7299290b59814af293b1dfc39e675e5651c81da285 WatchSource:0}: Error finding container 4a3bd01057e68ea21117eb7299290b59814af293b1dfc39e675e5651c81da285: Status 404 returned error can't find the container with id 
4a3bd01057e68ea21117eb7299290b59814af293b1dfc39e675e5651c81da285 Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.448312 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerStarted","Data":"bf44202b19950bf84851868715d82d44b952b328642fa877c2f4cd0d72bfb00d"} Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.448623 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerStarted","Data":"8aa8d7aa239f9ad34eddcdcab904b5d20b5462a1c20e76b19bcaea7b9580021e"} Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.450275 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.496772 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5396825b-8417-449a-90cb-c0755b9d83a4" path="/var/lib/kubelet/pods/5396825b-8417-449a-90cb-c0755b9d83a4/volumes" Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.507941 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-554db96b96-4xcnr" event={"ID":"e01db47e-4633-40f5-ad23-14867d89eba8","Type":"ContainerStarted","Data":"b49e83537f56c6824d7559ec13fd53d7fe0b391ec49a71d4fe0e3cf3896d407d"} Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.508012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-554db96b96-4xcnr" event={"ID":"e01db47e-4633-40f5-ad23-14867d89eba8","Type":"ContainerStarted","Data":"5f873ede072aab874528d96f11ac3df1b64ccd8bf7eeffe6a58b312ca6386c54"} Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.508028 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a9b1d7c8-7353-480a-aa6f-7031b5228838","Type":"ContainerStarted","Data":"4a3bd01057e68ea21117eb7299290b59814af293b1dfc39e675e5651c81da285"} Nov 22 04:28:23 crc kubenswrapper[4699]: I1122 04:28:23.517140 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-794784977b-czv6j" podStartSLOduration=4.664525844 podStartE2EDuration="8.517116029s" podCreationTimestamp="2025-11-22 04:28:15 +0000 UTC" firstStartedPulling="2025-11-22 04:28:16.408606285 +0000 UTC m=+1247.751227472" lastFinishedPulling="2025-11-22 04:28:20.26119647 +0000 UTC m=+1251.603817657" observedRunningTime="2025-11-22 04:28:23.471891542 +0000 UTC m=+1254.814512749" watchObservedRunningTime="2025-11-22 04:28:23.517116029 +0000 UTC m=+1254.859737226" Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.130886 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.367591 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.429292 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.429535 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerName="dnsmasq-dns" containerID="cri-o://39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee" gracePeriod=10 Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.490337 4699 generic.go:334] "Generic (PLEG): container finished" podID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerID="bf44202b19950bf84851868715d82d44b952b328642fa877c2f4cd0d72bfb00d" exitCode=1 Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.490411 4699 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerDied","Data":"bf44202b19950bf84851868715d82d44b952b328642fa877c2f4cd0d72bfb00d"} Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.491391 4699 scope.go:117] "RemoveContainer" containerID="bf44202b19950bf84851868715d82d44b952b328642fa877c2f4cd0d72bfb00d" Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.513549 4699 generic.go:334] "Generic (PLEG): container finished" podID="e01db47e-4633-40f5-ad23-14867d89eba8" containerID="b49e83537f56c6824d7559ec13fd53d7fe0b391ec49a71d4fe0e3cf3896d407d" exitCode=0 Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.513646 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-554db96b96-4xcnr" event={"ID":"e01db47e-4633-40f5-ad23-14867d89eba8","Type":"ContainerDied","Data":"b49e83537f56c6824d7559ec13fd53d7fe0b391ec49a71d4fe0e3cf3896d407d"} Nov 22 04:28:24 crc kubenswrapper[4699]: I1122 04:28:24.565203 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9b1d7c8-7353-480a-aa6f-7031b5228838","Type":"ContainerStarted","Data":"e6cfaac17f2e3b3a966348f93a5d575c0359495d49f19b449ad2148811151a25"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.337069 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.481550 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.482165 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.482659 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5cft\" (UniqueName: \"kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.482696 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.482875 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.482954 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0\") pod \"6e2eecb1-f103-46a7-9f37-5d2259df0703\" (UID: \"6e2eecb1-f103-46a7-9f37-5d2259df0703\") " Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.490690 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft" (OuterVolumeSpecName: "kube-api-access-d5cft") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "kube-api-access-d5cft". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.554844 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.578138 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config" (OuterVolumeSpecName: "config") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.601580 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.603401 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.603425 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.603452 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5cft\" (UniqueName: \"kubernetes.io/projected/6e2eecb1-f103-46a7-9f37-5d2259df0703-kube-api-access-d5cft\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.606301 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.607251 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.613975 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.660560 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerStarted","Data":"46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.662648 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.672071 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.673280 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-554db96b96-4xcnr" event={"ID":"e01db47e-4633-40f5-ad23-14867d89eba8","Type":"ContainerStarted","Data":"8f2b8d93c74ad264766016a94bdfe1385d4eba6bc86d32c91d8d095570e6bb54"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.679192 4699 generic.go:334] "Generic (PLEG): container finished" podID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerID="39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee" exitCode=0 Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.679323 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" event={"ID":"6e2eecb1-f103-46a7-9f37-5d2259df0703","Type":"ContainerDied","Data":"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.679488 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" event={"ID":"6e2eecb1-f103-46a7-9f37-5d2259df0703","Type":"ContainerDied","Data":"68b223ea71390f4c1e9075f18ffc88a9f106ce78361a1cf478621e5c25407b7e"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.679518 4699 scope.go:117] "RemoveContainer" containerID="39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.679534 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-6dplx" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.703200 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e2eecb1-f103-46a7-9f37-5d2259df0703" (UID: "6e2eecb1-f103-46a7-9f37-5d2259df0703"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.705339 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.716608 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.716636 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e2eecb1-f103-46a7-9f37-5d2259df0703-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.718073 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a9b1d7c8-7353-480a-aa6f-7031b5228838","Type":"ContainerStarted","Data":"1b3dad009f155f3b87251ab0047fe2a07f39c80b74052d218f15b307d2e6c349"} Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.765237 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.765212443 podStartE2EDuration="4.765212443s" podCreationTimestamp="2025-11-22 04:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:25.753882158 +0000 UTC m=+1257.096503365" watchObservedRunningTime="2025-11-22 04:28:25.765212443 +0000 UTC m=+1257.107833630" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.888681 4699 scope.go:117] "RemoveContainer" containerID="05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.908284 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64688bf4db-vwnwg" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.951643 4699 scope.go:117] "RemoveContainer" containerID="39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee" Nov 22 04:28:25 crc kubenswrapper[4699]: E1122 04:28:25.952546 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee\": container with ID starting with 39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee not found: ID does not exist" containerID="39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.952594 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee"} err="failed to get container status 
\"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee\": rpc error: code = NotFound desc = could not find container \"39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee\": container with ID starting with 39015ecde0bec245bc5a9f0feade583fd348c1a4f7c5ba8df442d6ffdb40e3ee not found: ID does not exist" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.952621 4699 scope.go:117] "RemoveContainer" containerID="05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a" Nov 22 04:28:25 crc kubenswrapper[4699]: E1122 04:28:25.953398 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a\": container with ID starting with 05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a not found: ID does not exist" containerID="05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a" Nov 22 04:28:25 crc kubenswrapper[4699]: I1122 04:28:25.953454 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a"} err="failed to get container status \"05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a\": rpc error: code = NotFound desc = could not find container \"05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a\": container with ID starting with 05316b81d7e4221f94e481b7fa2ac5f38641454bda7c23f22e789d2d7af2e13a not found: ID does not exist" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.039557 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.053505 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-6dplx"] Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.495205 4699 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.496035 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-central-agent" containerID="cri-o://c3ad54eab486e22378cf2f42efc53505fff8c1abc1e688c04363e261ec5bb886" gracePeriod=30 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.496199 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="sg-core" containerID="cri-o://50d65d3f9ebe4574d155557705664edbd18bac3c3589088e3c2f44d816f05d93" gracePeriod=30 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.496246 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-notification-agent" containerID="cri-o://00ee3fc918621b777b1c7549f562a1fe9dbd3a33ed4842f07528f68a5c4fd1de" gracePeriod=30 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.496206 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" containerID="cri-o://db342c311c0039efadf8c7766dc91c793c2268e558dc0b531bc482458151bfda" gracePeriod=30 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.530936 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": EOF" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.745082 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-554db96b96-4xcnr" 
event={"ID":"e01db47e-4633-40f5-ad23-14867d89eba8","Type":"ContainerStarted","Data":"33b2af4dff5f2cbde0b766e2ecbe0dff748d3a09cc63f408e2a87b3c4c8db08a"} Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.745180 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.778265 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-554db96b96-4xcnr" podStartSLOduration=5.778248384 podStartE2EDuration="5.778248384s" podCreationTimestamp="2025-11-22 04:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:26.773234942 +0000 UTC m=+1258.115856129" watchObservedRunningTime="2025-11-22 04:28:26.778248384 +0000 UTC m=+1258.120869571" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.786859 4699 generic.go:334] "Generic (PLEG): container finished" podID="474af2c7-c72f-4420-94a9-4876e0dbd68e" containerID="a505574fc6c1c81ade845e7ecd7a3e86581538f29889bd115174e6acfe5dbf52" exitCode=1 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.787004 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerDied","Data":"a505574fc6c1c81ade845e7ecd7a3e86581538f29889bd115174e6acfe5dbf52"} Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.787961 4699 scope.go:117] "RemoveContainer" containerID="a505574fc6c1c81ade845e7ecd7a3e86581538f29889bd115174e6acfe5dbf52" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.791732 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5997b85577-gkwmz"] Nov 22 04:28:26 crc kubenswrapper[4699]: E1122 04:28:26.792190 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" 
containerName="dnsmasq-dns" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.792203 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerName="dnsmasq-dns" Nov 22 04:28:26 crc kubenswrapper[4699]: E1122 04:28:26.792228 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerName="init" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.792234 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerName="init" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.792417 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" containerName="dnsmasq-dns" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.802387 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.821380 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.821463 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.821738 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.844927 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerID="50d65d3f9ebe4574d155557705664edbd18bac3c3589088e3c2f44d816f05d93" exitCode=2 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.844991 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerDied","Data":"50d65d3f9ebe4574d155557705664edbd18bac3c3589088e3c2f44d816f05d93"} Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.861072 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5997b85577-gkwmz"] Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.888241 4699 generic.go:334] "Generic (PLEG): container finished" podID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerID="46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71" exitCode=1 Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.888636 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerDied","Data":"46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71"} Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.888719 4699 scope.go:117] "RemoveContainer" containerID="bf44202b19950bf84851868715d82d44b952b328642fa877c2f4cd0d72bfb00d" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.894330 4699 scope.go:117] "RemoveContainer" containerID="46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71" Nov 22 04:28:26 crc kubenswrapper[4699]: E1122 04:28:26.894918 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-794784977b-czv6j_openstack(da52be58-8760-4d0a-866a-9eb3b47b2e8b)\"" pod="openstack/ironic-794784977b-czv6j" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948214 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-internal-tls-certs\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: 
\"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948271 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzczz\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-kube-api-access-fzczz\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948354 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-run-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948401 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-public-tls-certs\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948474 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-log-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948531 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-etc-swift\") pod 
\"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948549 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-config-data\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:26 crc kubenswrapper[4699]: I1122 04:28:26.948566 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-combined-ca-bundle\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.026955 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.050753 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzczz\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-kube-api-access-fzczz\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051614 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-run-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051733 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-public-tls-certs\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051812 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-log-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051907 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-etc-swift\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051933 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-config-data\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.051972 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-combined-ca-bundle\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.052019 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-internal-tls-certs\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.052883 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-run-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.055090 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3eaea68-e2a0-4b59-961e-eebded9815b1-log-httpd\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.060103 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-combined-ca-bundle\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.061062 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-internal-tls-certs\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.065138 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-public-tls-certs\") pod 
\"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.066910 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3eaea68-e2a0-4b59-961e-eebded9815b1-config-data\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.070997 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-etc-swift\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.084183 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzczz\" (UniqueName: \"kubernetes.io/projected/f3eaea68-e2a0-4b59-961e-eebded9815b1-kube-api-access-fzczz\") pod \"swift-proxy-5997b85577-gkwmz\" (UID: \"f3eaea68-e2a0-4b59-961e-eebded9815b1\") " pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.165820 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.490942 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2eecb1-f103-46a7-9f37-5d2259df0703" path="/var/lib/kubelet/pods/6e2eecb1-f103-46a7-9f37-5d2259df0703/volumes" Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.940878 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerID="db342c311c0039efadf8c7766dc91c793c2268e558dc0b531bc482458151bfda" exitCode=0 Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.940956 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerID="c3ad54eab486e22378cf2f42efc53505fff8c1abc1e688c04363e261ec5bb886" exitCode=0 Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.940972 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerDied","Data":"db342c311c0039efadf8c7766dc91c793c2268e558dc0b531bc482458151bfda"} Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.941103 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerDied","Data":"c3ad54eab486e22378cf2f42efc53505fff8c1abc1e688c04363e261ec5bb886"} Nov 22 04:28:27 crc kubenswrapper[4699]: I1122 04:28:27.943779 4699 scope.go:117] "RemoveContainer" containerID="46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71" Nov 22 04:28:27 crc kubenswrapper[4699]: E1122 04:28:27.944090 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-794784977b-czv6j_openstack(da52be58-8760-4d0a-866a-9eb3b47b2e8b)\"" pod="openstack/ironic-794784977b-czv6j" 
podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.176946 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vppfb"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.178291 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.192788 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-405a-account-create-grbrm"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.193994 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.197398 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.207467 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vppfb"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.214824 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-405a-account-create-grbrm"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.274803 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qf7bw"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.276855 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.282465 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qf7bw"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.296770 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhn2\" (UniqueName: \"kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.296879 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.296973 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.297005 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfqzd\" (UniqueName: \"kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.391734 4699 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mvxc4"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.393034 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399527 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399600 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399629 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfqzd\" (UniqueName: \"kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czpq\" (UniqueName: \"kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399723 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9dhn2\" (UniqueName: \"kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.399926 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.400765 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.403126 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.409956 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5ae7-account-create-5gj27"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.412355 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.417329 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.419818 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhn2\" (UniqueName: \"kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2\") pod \"nova-api-405a-account-create-grbrm\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.422405 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfqzd\" (UniqueName: \"kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd\") pod \"nova-api-db-create-vppfb\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.435094 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mvxc4"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.458102 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5ae7-account-create-5gj27"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.502083 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8czpq\" (UniqueName: \"kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.502157 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmmm\" (UniqueName: 
\"kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm\") pod \"nova-cell1-db-create-mvxc4\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.502196 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts\") pod \"nova-cell1-db-create-mvxc4\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.502277 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.503118 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.528605 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czpq\" (UniqueName: \"kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq\") pod \"nova-cell0-db-create-qf7bw\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.532991 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.555014 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.580806 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5945-account-create-kgfl7"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.582406 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.586924 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.604231 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.604295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8fl\" (UniqueName: \"kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.604347 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmmm\" (UniqueName: \"kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm\") pod \"nova-cell1-db-create-mvxc4\" (UID: 
\"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.604385 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts\") pod \"nova-cell1-db-create-mvxc4\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.605286 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts\") pod \"nova-cell1-db-create-mvxc4\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.617966 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.624727 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5945-account-create-kgfl7"] Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.642289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmmm\" (UniqueName: \"kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm\") pod \"nova-cell1-db-create-mvxc4\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.709282 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " 
pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.710690 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8fl\" (UniqueName: \"kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.710744 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.710829 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp4f8\" (UniqueName: \"kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.711852 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.734552 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8fl\" (UniqueName: 
\"kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl\") pod \"nova-cell0-5ae7-account-create-5gj27\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.812313 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.813538 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.813588 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp4f8\" (UniqueName: \"kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.814889 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.823975 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.835685 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp4f8\" (UniqueName: \"kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8\") pod \"nova-cell1-5945-account-create-kgfl7\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.911876 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.986970 4699 generic.go:334] "Generic (PLEG): container finished" podID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerID="00ee3fc918621b777b1c7549f562a1fe9dbd3a33ed4842f07528f68a5c4fd1de" exitCode=0 Nov 22 04:28:28 crc kubenswrapper[4699]: I1122 04:28:28.987020 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerDied","Data":"00ee3fc918621b777b1c7549f562a1fe9dbd3a33ed4842f07528f68a5c4fd1de"} Nov 22 04:28:29 crc kubenswrapper[4699]: E1122 04:28:29.212291 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod417b0282_cef1_4a7c_aca5_593297254fe3.slice/crio-d9be3bb7fa7a0781dffe600d9a6395a984bfdc0273d8e5fcae602ef27fcca7e9\": RecentStats: unable to find data in memory cache]" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.022260 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" 
podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.157:3000/\": dial tcp 10.217.0.157:3000: connect: connection refused" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.290001 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-kdn8x"] Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.292156 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.300407 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.301198 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.306052 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-kdn8x"] Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465106 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465165 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465247 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465286 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465361 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465407 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zdsp\" (UniqueName: \"kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.465496 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " 
pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.516728 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.516833 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567641 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567764 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567790 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc 
kubenswrapper[4699]: I1122 04:28:30.567877 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567936 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.567972 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zdsp\" (UniqueName: \"kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.568860 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.569111 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " 
pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.576223 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.577355 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.578191 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.583061 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.590261 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zdsp\" (UniqueName: \"kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp\") pod \"ironic-inspector-db-sync-kdn8x\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 
04:28:30.593756 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.594722 4699 scope.go:117] "RemoveContainer" containerID="46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71" Nov 22 04:28:30 crc kubenswrapper[4699]: E1122 04:28:30.594981 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-794784977b-czv6j_openstack(da52be58-8760-4d0a-866a-9eb3b47b2e8b)\"" pod="openstack/ironic-794784977b-czv6j" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" Nov 22 04:28:30 crc kubenswrapper[4699]: I1122 04:28:30.621801 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.114073 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.145778 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-405a-account-create-grbrm"] Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.202065 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59vjt\" (UniqueName: \"kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203280 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203347 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203393 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203530 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " 
Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203578 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.203636 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.205861 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.206299 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.211142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt" (OuterVolumeSpecName: "kube-api-access-59vjt") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "kube-api-access-59vjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.228195 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts" (OuterVolumeSpecName: "scripts") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.305541 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.305748 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") pod \"0ba3e973-312b-4343-a4c7-c6ab4a412703\" (UID: \"0ba3e973-312b-4343-a4c7-c6ab4a412703\") " Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.306304 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.306317 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.306326 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59vjt\" (UniqueName: 
\"kubernetes.io/projected/0ba3e973-312b-4343-a4c7-c6ab4a412703-kube-api-access-59vjt\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.306339 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba3e973-312b-4343-a4c7-c6ab4a412703-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: W1122 04:28:31.306362 4699 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0ba3e973-312b-4343-a4c7-c6ab4a412703/volumes/kubernetes.io~secret/sg-core-conf-yaml Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.306422 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: W1122 04:28:31.322231 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c24c357_369d_430b_a7ba_62783ed79d1f.slice/crio-1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa WatchSource:0}: Error finding container 1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa: Status 404 returned error can't find the container with id 1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.381648 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.408172 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.498174 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data" (OuterVolumeSpecName: "config-data") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.522994 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.540446 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ba3e973-312b-4343-a4c7-c6ab4a412703" (UID: "0ba3e973-312b-4343-a4c7-c6ab4a412703"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.625673 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba3e973-312b-4343-a4c7-c6ab4a412703-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.920161 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qf7bw"] Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.933774 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5ae7-account-create-5gj27"] Nov 22 04:28:31 crc kubenswrapper[4699]: W1122 04:28:31.936261 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aab7b5b_e294_4282_a3b3_75d47c1e911d.slice/crio-2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe WatchSource:0}: Error finding container 2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe: Status 404 returned error can't find the container with id 2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe Nov 22 04:28:31 crc kubenswrapper[4699]: W1122 04:28:31.939510 4699 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a913d6_76de_4f73_bc38_83471deabfdb.slice/crio-b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e WatchSource:0}: Error finding container b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e: Status 404 returned error can't find the container with id b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.946534 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 04:28:31 crc kubenswrapper[4699]: I1122 04:28:31.994993 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-kdn8x"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.046227 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.112981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba3e973-312b-4343-a4c7-c6ab4a412703","Type":"ContainerDied","Data":"12ea25681f2e105e75ac9c61ea5fbff5851db9a5984db83cb477a0790f60921a"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.113033 4699 scope.go:117] "RemoveContainer" containerID="db342c311c0039efadf8c7766dc91c793c2268e558dc0b531bc482458151bfda" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.113444 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.121173 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5ae7-account-create-5gj27" event={"ID":"2aab7b5b-e294-4282-a3b3-75d47c1e911d","Type":"ContainerStarted","Data":"2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.132040 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5945-account-create-kgfl7"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.141027 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.141112 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-kdn8x" event={"ID":"57ead407-5bf6-4cc4-ac17-e939d329f220","Type":"ContainerStarted","Data":"850b4b7ae1c0c871a828822fb8002c58f7c553e12318fc39694bff68ecec7d5b"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.144988 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mvxc4"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.155170 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5997b85577-gkwmz"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.166083 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vppfb"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.175696 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerStarted","Data":"da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.178035 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.214830 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qf7bw" event={"ID":"a0a913d6-76de-4f73-bc38-83471deabfdb","Type":"ContainerStarted","Data":"b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.234898 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.245652 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.247745 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-405a-account-create-grbrm" event={"ID":"4c24c357-369d-430b-a7ba-62783ed79d1f","Type":"ContainerStarted","Data":"1a0fba0263d2d23aed3014908bb57b46d4434eddf45ad1eb1da0cf9a68a9aa48"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.247789 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-405a-account-create-grbrm" event={"ID":"4c24c357-369d-430b-a7ba-62783ed79d1f","Type":"ContainerStarted","Data":"1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa"} Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.253068 4699 scope.go:117] "RemoveContainer" containerID="50d65d3f9ebe4574d155557705664edbd18bac3c3589088e3c2f44d816f05d93" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283252 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:32 crc kubenswrapper[4699]: E1122 04:28:32.283680 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-central-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283697 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-central-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: E1122 04:28:32.283716 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="sg-core" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283723 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="sg-core" Nov 22 04:28:32 crc kubenswrapper[4699]: E1122 04:28:32.283738 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-notification-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283744 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-notification-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: E1122 04:28:32.283758 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283764 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283951 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="proxy-httpd" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283971 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-central-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283981 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="ceilometer-notification-agent" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.283993 4699 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" containerName="sg-core" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.285934 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.293310 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.293556 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.305564 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.314288 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-405a-account-create-grbrm" podStartSLOduration=4.314265423 podStartE2EDuration="4.314265423s" podCreationTimestamp="2025-11-22 04:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:32.264139788 +0000 UTC m=+1263.606760975" watchObservedRunningTime="2025-11-22 04:28:32.314265423 +0000 UTC m=+1263.656886610" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.349752 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.349805 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data\") pod 
\"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.349820 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.349914 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.349940 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.350249 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9cc9\" (UniqueName: \"kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.350268 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc 
kubenswrapper[4699]: I1122 04:28:32.380665 4699 scope.go:117] "RemoveContainer" containerID="00ee3fc918621b777b1c7549f562a1fe9dbd3a33ed4842f07528f68a5c4fd1de" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.431804 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.457559 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.457611 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.457677 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9cc9\" (UniqueName: \"kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.457696 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.463629 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.463672 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.463691 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.464094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.464470 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.488232 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.497466 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.499394 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.500935 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.501596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9cc9\" (UniqueName: \"kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9\") pod \"ceilometer-0\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " pod="openstack/ceilometer-0" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.562725 4699 scope.go:117] "RemoveContainer" containerID="c3ad54eab486e22378cf2f42efc53505fff8c1abc1e688c04363e261ec5bb886" Nov 22 04:28:32 crc kubenswrapper[4699]: I1122 04:28:32.615977 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.244917 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.284137 4699 generic.go:334] "Generic (PLEG): container finished" podID="a0a913d6-76de-4f73-bc38-83471deabfdb" containerID="220d7837f924ca7d5e47951feefec2cc78626f5372dfc8470215283969de5d6f" exitCode=0 Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.284642 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qf7bw" event={"ID":"a0a913d6-76de-4f73-bc38-83471deabfdb","Type":"ContainerDied","Data":"220d7837f924ca7d5e47951feefec2cc78626f5372dfc8470215283969de5d6f"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.290881 4699 generic.go:334] "Generic (PLEG): container finished" podID="2aab7b5b-e294-4282-a3b3-75d47c1e911d" containerID="fbb886ee9e6ed61f05098b0796808eef56e6ceb040d06eb51b58021dc4e58977" exitCode=0 Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.290933 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5ae7-account-create-5gj27" event={"ID":"2aab7b5b-e294-4282-a3b3-75d47c1e911d","Type":"ContainerDied","Data":"fbb886ee9e6ed61f05098b0796808eef56e6ceb040d06eb51b58021dc4e58977"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.300086 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5997b85577-gkwmz" event={"ID":"f3eaea68-e2a0-4b59-961e-eebded9815b1","Type":"ContainerStarted","Data":"191bbe52bf59b9934750e9d3b1bdc3973ffd69cf1c1521b2516d03821eb30eaa"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.300138 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5997b85577-gkwmz" event={"ID":"f3eaea68-e2a0-4b59-961e-eebded9815b1","Type":"ContainerStarted","Data":"6893d96a5430fb26672c3abbc0780a921c6fed7aabf61ecc54b52e8106a1517e"} Nov 22 04:28:33 crc 
kubenswrapper[4699]: I1122 04:28:33.319203 4699 generic.go:334] "Generic (PLEG): container finished" podID="4c24c357-369d-430b-a7ba-62783ed79d1f" containerID="1a0fba0263d2d23aed3014908bb57b46d4434eddf45ad1eb1da0cf9a68a9aa48" exitCode=0 Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.319301 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-405a-account-create-grbrm" event={"ID":"4c24c357-369d-430b-a7ba-62783ed79d1f","Type":"ContainerDied","Data":"1a0fba0263d2d23aed3014908bb57b46d4434eddf45ad1eb1da0cf9a68a9aa48"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.343895 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvxc4" event={"ID":"e987a5a1-15e5-43db-b896-d68d46cf841d","Type":"ContainerStarted","Data":"cfab03b2d1e06c62fabc7abebc1e033ddf08a783ea64fbe5827e4b5bb9e70dc4"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.378677 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-mvxc4" podStartSLOduration=5.378652129 podStartE2EDuration="5.378652129s" podCreationTimestamp="2025-11-22 04:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:33.362990289 +0000 UTC m=+1264.705611476" watchObservedRunningTime="2025-11-22 04:28:33.378652129 +0000 UTC m=+1264.721273316" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.379488 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vppfb" event={"ID":"adab83b4-5c89-4ecd-af55-56492c7421b3","Type":"ContainerStarted","Data":"d283314d07f101264e5c7c630f6a848fddb263bfcd705b67c211e4d5457414ca"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.379543 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vppfb" 
event={"ID":"adab83b4-5c89-4ecd-af55-56492c7421b3","Type":"ContainerStarted","Data":"dbf8ddf6e09c7ba7355c3ee08476df15752db01cde57611c09fa9bd16fafa89d"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.386395 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5945-account-create-kgfl7" event={"ID":"4d971429-cae7-4fed-9849-343ec7364f54","Type":"ContainerStarted","Data":"9d9ea24bfbc6519b1e377f935771807718439cd55362fcfd039944e4a8475657"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.386463 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5945-account-create-kgfl7" event={"ID":"4d971429-cae7-4fed-9849-343ec7364f54","Type":"ContainerStarted","Data":"f1453f97f2f6daeb88506fa5dd9b7bdae7b141769817d455d48bcf166c6d4209"} Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.425463 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-vppfb" podStartSLOduration=5.425444343 podStartE2EDuration="5.425444343s" podCreationTimestamp="2025-11-22 04:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:33.394308498 +0000 UTC m=+1264.736929696" watchObservedRunningTime="2025-11-22 04:28:33.425444343 +0000 UTC m=+1264.768065540" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.459605 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-5945-account-create-kgfl7" podStartSLOduration=5.45954932 podStartE2EDuration="5.45954932s" podCreationTimestamp="2025-11-22 04:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:33.417387058 +0000 UTC m=+1264.760008245" watchObservedRunningTime="2025-11-22 04:28:33.45954932 +0000 UTC m=+1264.802170507" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 
04:28:33.481212 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba3e973-312b-4343-a4c7-c6ab4a412703" path="/var/lib/kubelet/pods/0ba3e973-312b-4343-a4c7-c6ab4a412703/volumes" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.699051 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-554db96b96-4xcnr" Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.779261 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:33 crc kubenswrapper[4699]: I1122 04:28:33.779541 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-794784977b-czv6j" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api-log" containerID="cri-o://8aa8d7aa239f9ad34eddcdcab904b5d20b5462a1c20e76b19bcaea7b9580021e" gracePeriod=60 Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.401755 4699 generic.go:334] "Generic (PLEG): container finished" podID="4d971429-cae7-4fed-9849-343ec7364f54" containerID="9d9ea24bfbc6519b1e377f935771807718439cd55362fcfd039944e4a8475657" exitCode=0 Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.401842 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5945-account-create-kgfl7" event={"ID":"4d971429-cae7-4fed-9849-343ec7364f54","Type":"ContainerDied","Data":"9d9ea24bfbc6519b1e377f935771807718439cd55362fcfd039944e4a8475657"} Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.405173 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerStarted","Data":"762caf951095c0a59e1aa64fb5ae81b24e27728a4e00654e262f5af667f931a9"} Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.408290 4699 generic.go:334] "Generic (PLEG): container finished" podID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" 
containerID="8aa8d7aa239f9ad34eddcdcab904b5d20b5462a1c20e76b19bcaea7b9580021e" exitCode=143 Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.408373 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerDied","Data":"8aa8d7aa239f9ad34eddcdcab904b5d20b5462a1c20e76b19bcaea7b9580021e"} Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.410612 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvxc4" event={"ID":"e987a5a1-15e5-43db-b896-d68d46cf841d","Type":"ContainerStarted","Data":"1dc9b48af9a286ac801387a62e24cb3cbd88475a440c0aa202e97d76cb30adc2"} Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.417321 4699 generic.go:334] "Generic (PLEG): container finished" podID="adab83b4-5c89-4ecd-af55-56492c7421b3" containerID="d283314d07f101264e5c7c630f6a848fddb263bfcd705b67c211e4d5457414ca" exitCode=0 Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.417464 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vppfb" event={"ID":"adab83b4-5c89-4ecd-af55-56492c7421b3","Type":"ContainerDied","Data":"d283314d07f101264e5c7c630f6a848fddb263bfcd705b67c211e4d5457414ca"} Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.513663 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:34 crc kubenswrapper[4699]: I1122 04:28:34.991312 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.118272 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.135927 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.142397 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts\") pod \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.143670 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dhn2\" (UniqueName: \"kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2\") pod \"4c24c357-369d-430b-a7ba-62783ed79d1f\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.143776 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts\") pod \"a0a913d6-76de-4f73-bc38-83471deabfdb\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.143831 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts\") pod \"4c24c357-369d-430b-a7ba-62783ed79d1f\" (UID: \"4c24c357-369d-430b-a7ba-62783ed79d1f\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.144029 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr8fl\" (UniqueName: \"kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl\") pod \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\" (UID: \"2aab7b5b-e294-4282-a3b3-75d47c1e911d\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.143332 4699 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2aab7b5b-e294-4282-a3b3-75d47c1e911d" (UID: "2aab7b5b-e294-4282-a3b3-75d47c1e911d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.144474 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8czpq\" (UniqueName: \"kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq\") pod \"a0a913d6-76de-4f73-bc38-83471deabfdb\" (UID: \"a0a913d6-76de-4f73-bc38-83471deabfdb\") " Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.145129 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0a913d6-76de-4f73-bc38-83471deabfdb" (UID: "a0a913d6-76de-4f73-bc38-83471deabfdb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.145644 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2aab7b5b-e294-4282-a3b3-75d47c1e911d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.145660 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a913d6-76de-4f73-bc38-83471deabfdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.145856 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c24c357-369d-430b-a7ba-62783ed79d1f" (UID: "4c24c357-369d-430b-a7ba-62783ed79d1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.149783 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl" (OuterVolumeSpecName: "kube-api-access-rr8fl") pod "2aab7b5b-e294-4282-a3b3-75d47c1e911d" (UID: "2aab7b5b-e294-4282-a3b3-75d47c1e911d"). InnerVolumeSpecName "kube-api-access-rr8fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.149933 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2" (OuterVolumeSpecName: "kube-api-access-9dhn2") pod "4c24c357-369d-430b-a7ba-62783ed79d1f" (UID: "4c24c357-369d-430b-a7ba-62783ed79d1f"). InnerVolumeSpecName "kube-api-access-9dhn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.158196 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq" (OuterVolumeSpecName: "kube-api-access-8czpq") pod "a0a913d6-76de-4f73-bc38-83471deabfdb" (UID: "a0a913d6-76de-4f73-bc38-83471deabfdb"). InnerVolumeSpecName "kube-api-access-8czpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.247033 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dhn2\" (UniqueName: \"kubernetes.io/projected/4c24c357-369d-430b-a7ba-62783ed79d1f-kube-api-access-9dhn2\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.247063 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c24c357-369d-430b-a7ba-62783ed79d1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.247072 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr8fl\" (UniqueName: \"kubernetes.io/projected/2aab7b5b-e294-4282-a3b3-75d47c1e911d-kube-api-access-rr8fl\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.247082 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8czpq\" (UniqueName: \"kubernetes.io/projected/a0a913d6-76de-4f73-bc38-83471deabfdb-kube-api-access-8czpq\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.434176 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5ae7-account-create-5gj27" event={"ID":"2aab7b5b-e294-4282-a3b3-75d47c1e911d","Type":"ContainerDied","Data":"2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe"} Nov 22 04:28:35 crc kubenswrapper[4699]: 
I1122 04:28:35.434222 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2568b7c01e3987b8dd763183ef3273ddc4b96a38ea315d505109ccbc630689fe" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.434278 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5ae7-account-create-5gj27" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.446624 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5997b85577-gkwmz" event={"ID":"f3eaea68-e2a0-4b59-961e-eebded9815b1","Type":"ContainerStarted","Data":"a9b0ff8bb8cb4e9705e2ec82e832e57e088146c5e6aaa33716b7be01531da903"} Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.455380 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-405a-account-create-grbrm" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.460280 4699 generic.go:334] "Generic (PLEG): container finished" podID="e987a5a1-15e5-43db-b896-d68d46cf841d" containerID="1dc9b48af9a286ac801387a62e24cb3cbd88475a440c0aa202e97d76cb30adc2" exitCode=0 Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.467000 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-405a-account-create-grbrm" event={"ID":"4c24c357-369d-430b-a7ba-62783ed79d1f","Type":"ContainerDied","Data":"1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa"} Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.467039 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f247118571ce12d62e5a96857539c78a48316ca4ab2aedd11f0aa1c497ea3aa" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.467058 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvxc4" event={"ID":"e987a5a1-15e5-43db-b896-d68d46cf841d","Type":"ContainerDied","Data":"1dc9b48af9a286ac801387a62e24cb3cbd88475a440c0aa202e97d76cb30adc2"} Nov 22 
04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.474601 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qf7bw" Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.474749 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qf7bw" event={"ID":"a0a913d6-76de-4f73-bc38-83471deabfdb","Type":"ContainerDied","Data":"b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e"} Nov 22 04:28:35 crc kubenswrapper[4699]: I1122 04:28:35.474848 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0979dcc665e39c4ed80d4e6dc4a13a4a6ac74dad00a16eb5cebb30f54d10b8e" Nov 22 04:28:36 crc kubenswrapper[4699]: I1122 04:28:36.485845 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:36 crc kubenswrapper[4699]: I1122 04:28:36.486184 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:36 crc kubenswrapper[4699]: I1122 04:28:36.509265 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5997b85577-gkwmz" podStartSLOduration=10.509248849 podStartE2EDuration="10.509248849s" podCreationTimestamp="2025-11-22 04:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:28:36.507134128 +0000 UTC m=+1267.849755315" watchObservedRunningTime="2025-11-22 04:28:36.509248849 +0000 UTC m=+1267.851870036" Nov 22 04:28:37 crc kubenswrapper[4699]: I1122 04:28:37.422345 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:28:37 crc kubenswrapper[4699]: I1122 04:28:37.497869 4699 generic.go:334] "Generic (PLEG): container finished" podID="474af2c7-c72f-4420-94a9-4876e0dbd68e" 
containerID="da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a" exitCode=1 Nov 22 04:28:37 crc kubenswrapper[4699]: I1122 04:28:37.497981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerDied","Data":"da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a"} Nov 22 04:28:37 crc kubenswrapper[4699]: I1122 04:28:37.498069 4699 scope.go:117] "RemoveContainer" containerID="a505574fc6c1c81ade845e7ecd7a3e86581538f29889bd115174e6acfe5dbf52" Nov 22 04:28:37 crc kubenswrapper[4699]: I1122 04:28:37.499635 4699 scope.go:117] "RemoveContainer" containerID="da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a" Nov 22 04:28:37 crc kubenswrapper[4699]: E1122 04:28:37.499957 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-65957c9c4f-4rj2b_openstack(474af2c7-c72f-4420-94a9-4876e0dbd68e)\"" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" podUID="474af2c7-c72f-4420-94a9-4876e0dbd68e" Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.537586 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.726107 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.726172 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.726219 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.727141 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:28:38 crc kubenswrapper[4699]: I1122 04:28:38.727203 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b" gracePeriod=600 Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.163659 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wjmlk"] Nov 22 04:28:39 crc kubenswrapper[4699]: E1122 04:28:39.164084 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c24c357-369d-430b-a7ba-62783ed79d1f" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164104 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c24c357-369d-430b-a7ba-62783ed79d1f" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: E1122 04:28:39.164117 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a913d6-76de-4f73-bc38-83471deabfdb" containerName="mariadb-database-create" Nov 22 
04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164124 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a913d6-76de-4f73-bc38-83471deabfdb" containerName="mariadb-database-create" Nov 22 04:28:39 crc kubenswrapper[4699]: E1122 04:28:39.164161 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aab7b5b-e294-4282-a3b3-75d47c1e911d" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164167 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aab7b5b-e294-4282-a3b3-75d47c1e911d" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164350 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aab7b5b-e294-4282-a3b3-75d47c1e911d" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164366 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a913d6-76de-4f73-bc38-83471deabfdb" containerName="mariadb-database-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.164383 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c24c357-369d-430b-a7ba-62783ed79d1f" containerName="mariadb-account-create" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.165018 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.167318 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.170647 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.171569 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vvxqx" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.182042 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wjmlk"] Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.263137 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.263186 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.263211 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fl98\" (UniqueName: \"kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " 
pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.263265 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.365164 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.365233 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.365267 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fl98\" (UniqueName: \"kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.365343 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: 
\"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.372454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.374739 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.376066 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.392323 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fl98\" (UniqueName: \"kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98\") pod \"nova-cell0-conductor-db-sync-wjmlk\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.499337 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.527958 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b" exitCode=0 Nov 22 04:28:39 crc kubenswrapper[4699]: I1122 04:28:39.528881 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b"} Nov 22 04:28:40 crc kubenswrapper[4699]: I1122 04:28:40.516268 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:40 crc kubenswrapper[4699]: I1122 04:28:40.516694 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:40 crc kubenswrapper[4699]: I1122 04:28:40.517534 4699 scope.go:117] "RemoveContainer" containerID="da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a" Nov 22 04:28:40 crc kubenswrapper[4699]: E1122 04:28:40.517799 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-65957c9c4f-4rj2b_openstack(474af2c7-c72f-4420-94a9-4876e0dbd68e)\"" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" podUID="474af2c7-c72f-4420-94a9-4876e0dbd68e" Nov 22 04:28:42 crc kubenswrapper[4699]: I1122 04:28:42.178013 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5997b85577-gkwmz" Nov 22 04:28:44 crc kubenswrapper[4699]: I1122 04:28:44.708005 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.200159 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.203304 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.212685 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.219306 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.297344 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-566cbdbc45-ld9jb" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332257 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332328 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332377 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo\") pod 
\"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332487 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfqzd\" (UniqueName: \"kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd\") pod \"adab83b4-5c89-4ecd-af55-56492c7421b3\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332515 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp4f8\" (UniqueName: \"kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8\") pod \"4d971429-cae7-4fed-9849-343ec7364f54\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332559 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts\") pod \"e987a5a1-15e5-43db-b896-d68d46cf841d\" (UID: \"e987a5a1-15e5-43db-b896-d68d46cf841d\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332638 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scgt2\" (UniqueName: \"kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332677 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332707 4699 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts\") pod \"adab83b4-5c89-4ecd-af55-56492c7421b3\" (UID: \"adab83b4-5c89-4ecd-af55-56492c7421b3\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332772 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332828 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged\") pod \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\" (UID: \"da52be58-8760-4d0a-866a-9eb3b47b2e8b\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332906 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts\") pod \"4d971429-cae7-4fed-9849-343ec7364f54\" (UID: \"4d971429-cae7-4fed-9849-343ec7364f54\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.332938 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdmmm\" (UniqueName: \"kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm\") pod \"e987a5a1-15e5-43db-b896-d68d46cf841d\" (UID: 
\"e987a5a1-15e5-43db-b896-d68d46cf841d\") " Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.342616 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd" (OuterVolumeSpecName: "kube-api-access-nfqzd") pod "adab83b4-5c89-4ecd-af55-56492c7421b3" (UID: "adab83b4-5c89-4ecd-af55-56492c7421b3"). InnerVolumeSpecName "kube-api-access-nfqzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.344934 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm" (OuterVolumeSpecName: "kube-api-access-kdmmm") pod "e987a5a1-15e5-43db-b896-d68d46cf841d" (UID: "e987a5a1-15e5-43db-b896-d68d46cf841d"). InnerVolumeSpecName "kube-api-access-kdmmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.345454 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.346157 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs" (OuterVolumeSpecName: "logs") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.346784 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "adab83b4-5c89-4ecd-af55-56492c7421b3" (UID: "adab83b4-5c89-4ecd-af55-56492c7421b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.347118 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2" (OuterVolumeSpecName: "kube-api-access-scgt2") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "kube-api-access-scgt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.347781 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e987a5a1-15e5-43db-b896-d68d46cf841d" (UID: "e987a5a1-15e5-43db-b896-d68d46cf841d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.350186 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8" (OuterVolumeSpecName: "kube-api-access-wp4f8") pod "4d971429-cae7-4fed-9849-343ec7364f54" (UID: "4d971429-cae7-4fed-9849-343ec7364f54"). InnerVolumeSpecName "kube-api-access-wp4f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.351448 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d971429-cae7-4fed-9849-343ec7364f54" (UID: "4d971429-cae7-4fed-9849-343ec7364f54"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.366636 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts" (OuterVolumeSpecName: "scripts") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.374881 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.405602 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.406105 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f6d546c9b-wrks9" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-api" containerID="cri-o://7e1fcbb58973272b444fef351419c088db0c22f36b8210be107197b7f9bc8eaa" gracePeriod=30 Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.406356 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f6d546c9b-wrks9" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-httpd" containerID="cri-o://93774c3e979cfc6002da894382cdd76ae9a2684d25c31d0ab44f93df7f619464" gracePeriod=30 Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.418754 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.442942 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scgt2\" (UniqueName: \"kubernetes.io/projected/da52be58-8760-4d0a-866a-9eb3b47b2e8b-kube-api-access-scgt2\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.442981 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.442993 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adab83b4-5c89-4ecd-af55-56492c7421b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443003 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443014 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-merged\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443026 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d971429-cae7-4fed-9849-343ec7364f54-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443037 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdmmm\" (UniqueName: \"kubernetes.io/projected/e987a5a1-15e5-43db-b896-d68d46cf841d-kube-api-access-kdmmm\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 
04:28:47.443048 4699 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443059 4699 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/da52be58-8760-4d0a-866a-9eb3b47b2e8b-etc-podinfo\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443071 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfqzd\" (UniqueName: \"kubernetes.io/projected/adab83b4-5c89-4ecd-af55-56492c7421b3-kube-api-access-nfqzd\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443083 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp4f8\" (UniqueName: \"kubernetes.io/projected/4d971429-cae7-4fed-9849-343ec7364f54-kube-api-access-wp4f8\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.443095 4699 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e987a5a1-15e5-43db-b896-d68d46cf841d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.480618 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data" (OuterVolumeSpecName: "config-data") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.534589 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da52be58-8760-4d0a-866a-9eb3b47b2e8b" (UID: "da52be58-8760-4d0a-866a-9eb3b47b2e8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.544590 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.544627 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da52be58-8760-4d0a-866a-9eb3b47b2e8b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.603798 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mvxc4" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.606495 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-vppfb" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.607832 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mvxc4" event={"ID":"e987a5a1-15e5-43db-b896-d68d46cf841d","Type":"ContainerDied","Data":"cfab03b2d1e06c62fabc7abebc1e033ddf08a783ea64fbe5827e4b5bb9e70dc4"} Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.607868 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfab03b2d1e06c62fabc7abebc1e033ddf08a783ea64fbe5827e4b5bb9e70dc4" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.607878 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vppfb" event={"ID":"adab83b4-5c89-4ecd-af55-56492c7421b3","Type":"ContainerDied","Data":"dbf8ddf6e09c7ba7355c3ee08476df15752db01cde57611c09fa9bd16fafa89d"} Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.607890 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf8ddf6e09c7ba7355c3ee08476df15752db01cde57611c09fa9bd16fafa89d" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.609971 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5945-account-create-kgfl7" event={"ID":"4d971429-cae7-4fed-9849-343ec7364f54","Type":"ContainerDied","Data":"f1453f97f2f6daeb88506fa5dd9b7bdae7b141769817d455d48bcf166c6d4209"} Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.610002 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1453f97f2f6daeb88506fa5dd9b7bdae7b141769817d455d48bcf166c6d4209" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.610054 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5945-account-create-kgfl7" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.622192 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-794784977b-czv6j" event={"ID":"da52be58-8760-4d0a-866a-9eb3b47b2e8b","Type":"ContainerDied","Data":"a32075ab60ca7e8d40f944859afbe3afb51f7afae7ff399b48dd5462e1558611"} Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.622340 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-794784977b-czv6j" Nov 22 04:28:47 crc kubenswrapper[4699]: E1122 04:28:47.686619 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Nov 22 04:28:47 crc kubenswrapper[4699]: E1122 04:28:47.687532 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68dh5c9h684h8dh67dhd7h95h695h4h57bh67h59ch5b4h55h675h54fh64chch65dhb6h5ddh686h689h78h5f6h674hb9hb7h586h5d5h5f7h564q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagat
ion:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdc9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(b3f3d84b-ad88-4145-9e18-b2baa8eff9c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:28:47 crc kubenswrapper[4699]: E1122 04:28:47.688752 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="b3f3d84b-ad88-4145-9e18-b2baa8eff9c4" Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.719484 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:47 crc kubenswrapper[4699]: I1122 04:28:47.737602 4699 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ironic-794784977b-czv6j"] Nov 22 04:28:48 crc kubenswrapper[4699]: I1122 04:28:48.638814 4699 generic.go:334] "Generic (PLEG): container finished" podID="d112a61a-4828-4d29-b47d-ee894ca24784" containerID="93774c3e979cfc6002da894382cdd76ae9a2684d25c31d0ab44f93df7f619464" exitCode=0 Nov 22 04:28:48 crc kubenswrapper[4699]: I1122 04:28:48.638894 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerDied","Data":"93774c3e979cfc6002da894382cdd76ae9a2684d25c31d0ab44f93df7f619464"} Nov 22 04:28:48 crc kubenswrapper[4699]: E1122 04:28:48.640552 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="b3f3d84b-ad88-4145-9e18-b2baa8eff9c4" Nov 22 04:28:49 crc kubenswrapper[4699]: I1122 04:28:49.466764 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" path="/var/lib/kubelet/pods/da52be58-8760-4d0a-866a-9eb3b47b2e8b/volumes" Nov 22 04:28:49 crc kubenswrapper[4699]: I1122 04:28:49.649753 4699 generic.go:334] "Generic (PLEG): container finished" podID="d112a61a-4828-4d29-b47d-ee894ca24784" containerID="7e1fcbb58973272b444fef351419c088db0c22f36b8210be107197b7f9bc8eaa" exitCode=0 Nov 22 04:28:49 crc kubenswrapper[4699]: I1122 04:28:49.649801 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerDied","Data":"7e1fcbb58973272b444fef351419c088db0c22f36b8210be107197b7f9bc8eaa"} Nov 22 04:28:51 crc kubenswrapper[4699]: E1122 04:28:51.907591 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/ironic-python-agent:current-podified" Nov 22 04:28:51 crc kubenswrapper[4699]: E1122 04:28:51.908283 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ironic-python-agent-init,Image:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DEST_DIR,Value:/var/lib/ironic/httpboot,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-merged,ReadOnly:false,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-podinfo,ReadOnly:false,MountPath:/etc/podinfo,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib-ironic,ReadOnly:false,MountPath:/var/lib/ironic,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-custom,ReadOnly:true,MountPath:/var/lib/config-data/custom,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5nj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy
:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-conductor-0_openstack(6b0a42c8-e8a1-45b3-9f29-77459d98ea4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:28:51 crc kubenswrapper[4699]: E1122 04:28:51.909750 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-python-agent-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ironic-conductor-0" podUID="6b0a42c8-e8a1-45b3-9f29-77459d98ea4d" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.447756 4699 scope.go:117] "RemoveContainer" containerID="da2cb93d6c05afd426db6f8dd4d552a0f13e61dab834856411f3ac66c81de07a" Nov 22 04:28:52 crc kubenswrapper[4699]: E1122 04:28:52.591837 4699 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified" Nov 22 04:28:52 crc kubenswrapper[4699]: E1122 04:28:52.599629 4699 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ironic-inspector-db-sync,Image:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /bin/bash -c 'ironic-inspector-dbsync --config-file /etc/ironic-inspector/inspector.conf --config-dir /etc/ironic-inspector/inspector.conf.d 
upgrade'],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib-ironic,ReadOnly:false,MountPath:/var/lib/ironic,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib-ironic-inspector-dhcp-hostsdir,ReadOnly:false,MountPath:/var/lib/ironic-inspector/dhcp-hostsdir,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-podinfo,ReadOnly:false,MountPath:/etc/podinfo,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5zdsp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPro
file:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-inspector-db-sync-kdn8x_openstack(57ead407-5bf6-4cc4-ac17-e939d329f220): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 04:28:52 crc kubenswrapper[4699]: E1122 04:28:52.600886 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-inspector-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ironic-inspector-db-sync-kdn8x" podUID="57ead407-5bf6-4cc4-ac17-e939d329f220" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.611174 4699 scope.go:117] "RemoveContainer" containerID="860a4fc3095846d1c30f6bfb9c79f3b411c14f316e6ed54ad090c3a0186b2e5c" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.691081 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f6d546c9b-wrks9" event={"ID":"d112a61a-4828-4d29-b47d-ee894ca24784","Type":"ContainerDied","Data":"983884d9b2b360e2a9acf79e9de2759b9dad21849d254cb14e57da4bf5e190d0"} Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.691362 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="983884d9b2b360e2a9acf79e9de2759b9dad21849d254cb14e57da4bf5e190d0" Nov 22 04:28:52 crc kubenswrapper[4699]: E1122 04:28:52.692884 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-python-agent-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/ironic-python-agent:current-podified\\\"\"" pod="openstack/ironic-conductor-0" podUID="6b0a42c8-e8a1-45b3-9f29-77459d98ea4d" Nov 22 04:28:52 crc kubenswrapper[4699]: E1122 04:28:52.693327 4699 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-inspector-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified\\\"\"" pod="openstack/ironic-inspector-db-sync-kdn8x" podUID="57ead407-5bf6-4cc4-ac17-e939d329f220" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.766765 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.786733 4699 scope.go:117] "RemoveContainer" containerID="46bc820d607b57705b9d4e19c057ad7b8276c2f5d3ecc5eae1027faece236c71" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.904869 4699 scope.go:117] "RemoveContainer" containerID="8aa8d7aa239f9ad34eddcdcab904b5d20b5462a1c20e76b19bcaea7b9580021e" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.944850 4699 scope.go:117] "RemoveContainer" containerID="7ed953b0d80e9df6281772ea99e9ca5db990127d1c62e1cddc841220e453adca" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.961872 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle\") pod \"d112a61a-4828-4d29-b47d-ee894ca24784\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.962267 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config\") pod \"d112a61a-4828-4d29-b47d-ee894ca24784\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.962292 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7q6g\" (UniqueName: 
\"kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g\") pod \"d112a61a-4828-4d29-b47d-ee894ca24784\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.962341 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs\") pod \"d112a61a-4828-4d29-b47d-ee894ca24784\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.962409 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config\") pod \"d112a61a-4828-4d29-b47d-ee894ca24784\" (UID: \"d112a61a-4828-4d29-b47d-ee894ca24784\") " Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.974023 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g" (OuterVolumeSpecName: "kube-api-access-r7q6g") pod "d112a61a-4828-4d29-b47d-ee894ca24784" (UID: "d112a61a-4828-4d29-b47d-ee894ca24784"). InnerVolumeSpecName "kube-api-access-r7q6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:28:52 crc kubenswrapper[4699]: I1122 04:28:52.976883 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d112a61a-4828-4d29-b47d-ee894ca24784" (UID: "d112a61a-4828-4d29-b47d-ee894ca24784"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.069167 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7q6g\" (UniqueName: \"kubernetes.io/projected/d112a61a-4828-4d29-b47d-ee894ca24784-kube-api-access-r7q6g\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.069848 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.078877 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wjmlk"] Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.133863 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d112a61a-4828-4d29-b47d-ee894ca24784" (UID: "d112a61a-4828-4d29-b47d-ee894ca24784"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.139732 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config" (OuterVolumeSpecName: "config") pod "d112a61a-4828-4d29-b47d-ee894ca24784" (UID: "d112a61a-4828-4d29-b47d-ee894ca24784"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.165375 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d112a61a-4828-4d29-b47d-ee894ca24784" (UID: "d112a61a-4828-4d29-b47d-ee894ca24784"). 
InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.172706 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.172758 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.172770 4699 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d112a61a-4828-4d29-b47d-ee894ca24784-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.702489 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerStarted","Data":"581cab41cacc1dd2b68bb2854b6abdcd29190d14487f8e90d07fcba4e4c4de93"} Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.703417 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerStarted","Data":"f25414a683ba6dfae7040281e3b67a73338658be676fef98627ada0e4529e01b"} Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.706396 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" event={"ID":"2543fbe9-12e0-40d5-8474-dab6ed3144be","Type":"ContainerStarted","Data":"d82b671dbea9378a40c53b750ba6eec4a341d49dc8f0555d825aea1d4737829c"} Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.709402 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" 
event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722"} Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.716567 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f6d546c9b-wrks9" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.717136 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" event={"ID":"474af2c7-c72f-4420-94a9-4876e0dbd68e","Type":"ContainerStarted","Data":"3e0fbc26946db962e767aa6cb5317884efd9809d79c6253d81f7ff5ad38a9dbd"} Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.717341 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.776573 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:53 crc kubenswrapper[4699]: I1122 04:28:53.787079 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f6d546c9b-wrks9"] Nov 22 04:28:55 crc kubenswrapper[4699]: I1122 04:28:55.459511 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" path="/var/lib/kubelet/pods/d112a61a-4828-4d29-b47d-ee894ca24784/volumes" Nov 22 04:28:55 crc kubenswrapper[4699]: I1122 04:28:55.740079 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerStarted","Data":"eb069f9dcf6724775d3ca4f2ebe640a91eae455c2898ac9c24dabc926f70928b"} Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.754875 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerStarted","Data":"2f7d5adb52501003a0b54d4dfbdfa423ec48807e22ea0f795308022f0fcde299"} Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.755220 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.755080 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="sg-core" containerID="cri-o://eb069f9dcf6724775d3ca4f2ebe640a91eae455c2898ac9c24dabc926f70928b" gracePeriod=30 Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.755033 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="proxy-httpd" containerID="cri-o://2f7d5adb52501003a0b54d4dfbdfa423ec48807e22ea0f795308022f0fcde299" gracePeriod=30 Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.755092 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-notification-agent" containerID="cri-o://581cab41cacc1dd2b68bb2854b6abdcd29190d14487f8e90d07fcba4e4c4de93" gracePeriod=30 Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.755116 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-central-agent" containerID="cri-o://f25414a683ba6dfae7040281e3b67a73338658be676fef98627ada0e4529e01b" gracePeriod=30 Nov 22 04:28:56 crc kubenswrapper[4699]: I1122 04:28:56.786349 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.308159594 podStartE2EDuration="24.786324565s" podCreationTimestamp="2025-11-22 04:28:32 +0000 UTC" 
firstStartedPulling="2025-11-22 04:28:33.318980722 +0000 UTC m=+1264.661601909" lastFinishedPulling="2025-11-22 04:28:55.797145693 +0000 UTC m=+1287.139766880" observedRunningTime="2025-11-22 04:28:56.781298383 +0000 UTC m=+1288.123919570" watchObservedRunningTime="2025-11-22 04:28:56.786324565 +0000 UTC m=+1288.128945752" Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.770848 4699 generic.go:334] "Generic (PLEG): container finished" podID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerID="2f7d5adb52501003a0b54d4dfbdfa423ec48807e22ea0f795308022f0fcde299" exitCode=0 Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771168 4699 generic.go:334] "Generic (PLEG): container finished" podID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerID="eb069f9dcf6724775d3ca4f2ebe640a91eae455c2898ac9c24dabc926f70928b" exitCode=2 Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771176 4699 generic.go:334] "Generic (PLEG): container finished" podID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerID="581cab41cacc1dd2b68bb2854b6abdcd29190d14487f8e90d07fcba4e4c4de93" exitCode=0 Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771182 4699 generic.go:334] "Generic (PLEG): container finished" podID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerID="f25414a683ba6dfae7040281e3b67a73338658be676fef98627ada0e4529e01b" exitCode=0 Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerDied","Data":"2f7d5adb52501003a0b54d4dfbdfa423ec48807e22ea0f795308022f0fcde299"} Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771311 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerDied","Data":"eb069f9dcf6724775d3ca4f2ebe640a91eae455c2898ac9c24dabc926f70928b"} Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 
04:28:57.771325 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerDied","Data":"581cab41cacc1dd2b68bb2854b6abdcd29190d14487f8e90d07fcba4e4c4de93"} Nov 22 04:28:57 crc kubenswrapper[4699]: I1122 04:28:57.771337 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerDied","Data":"f25414a683ba6dfae7040281e3b67a73338658be676fef98627ada0e4529e01b"} Nov 22 04:28:59 crc kubenswrapper[4699]: I1122 04:28:59.237985 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:28:59 crc kubenswrapper[4699]: I1122 04:28:59.241852 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-log" containerID="cri-o://af2485d201b5d7082d752db10c95f6a969bf19c959ffbcb882551a8bf0e3248a" gracePeriod=30 Nov 22 04:28:59 crc kubenswrapper[4699]: I1122 04:28:59.242256 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-httpd" containerID="cri-o://9c3a6f68089699e05f57c467995c468566c3aee2a5f9f2e59cb387045899b545" gracePeriod=30 Nov 22 04:28:59 crc kubenswrapper[4699]: I1122 04:28:59.803602 4699 generic.go:334] "Generic (PLEG): container finished" podID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerID="af2485d201b5d7082d752db10c95f6a969bf19c959ffbcb882551a8bf0e3248a" exitCode=143 Nov 22 04:28:59 crc kubenswrapper[4699]: I1122 04:28:59.803685 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerDied","Data":"af2485d201b5d7082d752db10c95f6a969bf19c959ffbcb882551a8bf0e3248a"} Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.117688 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.117942 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-log" containerID="cri-o://0965e0e4b0f9d33255e1ca3fe32acf964e60ff77ca0da40d7a5561ecbf5baff7" gracePeriod=30 Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.118233 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-httpd" containerID="cri-o://adfcd113ddb0a425861801cc4ebb792f2beb998c582350cf8e362757c1b7afa5" gracePeriod=30 Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.550375 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-65957c9c4f-4rj2b" Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.816600 4699 generic.go:334] "Generic (PLEG): container finished" podID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerID="0965e0e4b0f9d33255e1ca3fe32acf964e60ff77ca0da40d7a5561ecbf5baff7" exitCode=143 Nov 22 04:29:00 crc kubenswrapper[4699]: I1122 04:29:00.816706 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerDied","Data":"0965e0e4b0f9d33255e1ca3fe32acf964e60ff77ca0da40d7a5561ecbf5baff7"} Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.618814 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" 
podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": dial tcp 10.217.0.149:9292: connect: connection refused" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.619086 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": dial tcp 10.217.0.149:9292: connect: connection refused" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.691774 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.691775 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9292/healthcheck\": dial tcp 10.217.0.150:9292: connect: connection refused" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.732497 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.860353 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"832e24bf-7e8a-4e3f-be16-acd8ecff139d","Type":"ContainerDied","Data":"762caf951095c0a59e1aa64fb5ae81b24e27728a4e00654e262f5af667f931a9"} Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.860409 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.860543 4699 scope.go:117] "RemoveContainer" containerID="2f7d5adb52501003a0b54d4dfbdfa423ec48807e22ea0f795308022f0fcde299" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.878070 4699 generic.go:334] "Generic (PLEG): container finished" podID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerID="adfcd113ddb0a425861801cc4ebb792f2beb998c582350cf8e362757c1b7afa5" exitCode=0 Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.878184 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerDied","Data":"adfcd113ddb0a425861801cc4ebb792f2beb998c582350cf8e362757c1b7afa5"} Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.881011 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" event={"ID":"2543fbe9-12e0-40d5-8474-dab6ed3144be","Type":"ContainerStarted","Data":"624b463938f550cc17b4e670673bf66a20b8b7cf86c27c5f9a60406c37b18761"} Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.886891 4699 generic.go:334] "Generic (PLEG): container finished" podID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerID="9c3a6f68089699e05f57c467995c468566c3aee2a5f9f2e59cb387045899b545" exitCode=0 Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.886989 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerDied","Data":"9c3a6f68089699e05f57c467995c468566c3aee2a5f9f2e59cb387045899b545"} Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.889284 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b3f3d84b-ad88-4145-9e18-b2baa8eff9c4","Type":"ContainerStarted","Data":"ad61b071b6c92d9de0aa343abd8ebc2645d4c3839477d3884e2041d46448ca00"} Nov 
22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.904912 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" podStartSLOduration=14.462773722 podStartE2EDuration="24.904895983s" podCreationTimestamp="2025-11-22 04:28:39 +0000 UTC" firstStartedPulling="2025-11-22 04:28:53.088556643 +0000 UTC m=+1284.431177830" lastFinishedPulling="2025-11-22 04:29:03.530678904 +0000 UTC m=+1294.873300091" observedRunningTime="2025-11-22 04:29:03.901564585 +0000 UTC m=+1295.244185772" watchObservedRunningTime="2025-11-22 04:29:03.904895983 +0000 UTC m=+1295.247517170" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910208 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910276 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910316 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910428 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9cc9\" (UniqueName: \"kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9\") pod 
\"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910486 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910570 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.910607 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd\") pod \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\" (UID: \"832e24bf-7e8a-4e3f-be16-acd8ecff139d\") " Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.911572 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.911890 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.922131 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9" (OuterVolumeSpecName: "kube-api-access-q9cc9") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "kube-api-access-q9cc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.922520 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts" (OuterVolumeSpecName: "scripts") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.932381 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.353963424 podStartE2EDuration="45.932330341s" podCreationTimestamp="2025-11-22 04:28:18 +0000 UTC" firstStartedPulling="2025-11-22 04:28:20.925190387 +0000 UTC m=+1252.267811584" lastFinishedPulling="2025-11-22 04:29:03.503557314 +0000 UTC m=+1294.846178501" observedRunningTime="2025-11-22 04:29:03.92680845 +0000 UTC m=+1295.269429637" watchObservedRunningTime="2025-11-22 04:29:03.932330341 +0000 UTC m=+1295.274951528" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.953358 4699 scope.go:117] "RemoveContainer" containerID="eb069f9dcf6724775d3ca4f2ebe640a91eae455c2898ac9c24dabc926f70928b" Nov 22 04:29:03 crc kubenswrapper[4699]: I1122 04:29:03.966895 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod 
"832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.006885 4699 scope.go:117] "RemoveContainer" containerID="581cab41cacc1dd2b68bb2854b6abdcd29190d14487f8e90d07fcba4e4c4de93" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.013375 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.014051 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.014111 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/832e24bf-7e8a-4e3f-be16-acd8ecff139d-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.014124 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.014136 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9cc9\" (UniqueName: \"kubernetes.io/projected/832e24bf-7e8a-4e3f-be16-acd8ecff139d-kube-api-access-q9cc9\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.014148 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.042522 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.043580 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.119827 4699 scope.go:117] "RemoveContainer" containerID="f25414a683ba6dfae7040281e3b67a73338658be676fef98627ada0e4529e01b" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124605 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124680 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124775 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124894 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" 
(UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124926 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.124983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125011 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125034 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125102 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125149 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125229 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125287 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125749 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs" (OuterVolumeSpecName: "logs") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125358 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125937 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.125983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjv87\" (UniqueName: \"kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87\") pod \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\" (UID: \"0f028f32-9e14-40c5-9944-3fed1f6c2aee\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.126022 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk7lv\" (UniqueName: \"kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv\") pod \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\" (UID: \"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6\") " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.126729 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.126759 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 
22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.127409 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.127764 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs" (OuterVolumeSpecName: "logs") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.129593 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.149903 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts" (OuterVolumeSpecName: "scripts") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.152663 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts" (OuterVolumeSpecName: "scripts") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.154468 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.155755 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv" (OuterVolumeSpecName: "kube-api-access-pk7lv") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "kube-api-access-pk7lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.157775 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87" (OuterVolumeSpecName: "kube-api-access-cjv87") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "kube-api-access-cjv87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.192158 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.199912 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.222839 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231884 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231925 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231937 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231956 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231967 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231977 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231986 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f028f32-9e14-40c5-9944-3fed1f6c2aee-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.231998 4699 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.232007 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.232019 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjv87\" (UniqueName: \"kubernetes.io/projected/0f028f32-9e14-40c5-9944-3fed1f6c2aee-kube-api-access-cjv87\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.232030 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk7lv\" (UniqueName: \"kubernetes.io/projected/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-kube-api-access-pk7lv\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.252240 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data" (OuterVolumeSpecName: "config-data") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.254776 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.285256 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data" (OuterVolumeSpecName: "config-data") pod "832e24bf-7e8a-4e3f-be16-acd8ecff139d" (UID: "832e24bf-7e8a-4e3f-be16-acd8ecff139d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.285549 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.290377 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.313191 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" (UID: "c9bd4fef-05a7-44fd-9c7d-dd9118839aa6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.313745 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data" (OuterVolumeSpecName: "config-data") pod "0f028f32-9e14-40c5-9944-3fed1f6c2aee" (UID: "0f028f32-9e14-40c5-9944-3fed1f6c2aee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334115 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334508 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334701 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334821 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f028f32-9e14-40c5-9944-3fed1f6c2aee-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334906 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/832e24bf-7e8a-4e3f-be16-acd8ecff139d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.334976 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.335055 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.509302 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.527609 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.544972 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.549832 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550020 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.550113 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-notification-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550179 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-notification-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.550252 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="init" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550313 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="init" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.550382 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-central-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550453 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-central-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.550524 4699 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e987a5a1-15e5-43db-b896-d68d46cf841d" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550588 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="e987a5a1-15e5-43db-b896-d68d46cf841d" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.550664 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d971429-cae7-4fed-9849-343ec7364f54" containerName="mariadb-account-create" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.550926 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d971429-cae7-4fed-9849-343ec7364f54" containerName="mariadb-account-create" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.551026 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="proxy-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551105 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="proxy-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.551178 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="sg-core" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551250 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="sg-core" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.551347 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adab83b4-5c89-4ecd-af55-56492c7421b3" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551420 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="adab83b4-5c89-4ecd-af55-56492c7421b3" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 
04:29:04.551547 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551617 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.551752 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551833 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-api" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.551930 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.551996 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.552182 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.552241 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.552312 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.552365 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.552459 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.552516 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api-log" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.552583 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.552635 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.553613 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.553734 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.553817 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.553928 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554017 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="adab83b4-5c89-4ecd-af55-56492c7421b3" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554115 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554192 4699 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="proxy-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554279 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-notification-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554369 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d112a61a-4828-4d29-b47d-ee894ca24784" containerName="neutron-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554485 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api-log" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554594 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="sg-core" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554703 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" containerName="glance-httpd" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554816 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d971429-cae7-4fed-9849-343ec7364f54" containerName="mariadb-account-create" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.554902 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" containerName="ceilometer-central-agent" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.555006 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="e987a5a1-15e5-43db-b896-d68d46cf841d" containerName="mariadb-database-create" Nov 22 04:29:04 crc kubenswrapper[4699]: E1122 04:29:04.555838 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: 
I1122 04:29:04.555934 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.556720 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="da52be58-8760-4d0a-866a-9eb3b47b2e8b" containerName="ironic-api" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.562554 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.564220 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.573950 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.575033 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644180 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644361 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644399 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnvr\" (UniqueName: 
\"kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644456 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644510 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.644539 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746569 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " 
pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746656 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746763 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccnvr\" (UniqueName: \"kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.746959 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.747010 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.747327 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.747395 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.751424 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.751453 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.762252 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.769244 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.770063 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccnvr\" (UniqueName: \"kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr\") pod \"ceilometer-0\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.903868 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.928962 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9bd4fef-05a7-44fd-9c7d-dd9118839aa6","Type":"ContainerDied","Data":"2a94d26bbb79e144ca368c998909adc7edaf85b5f7dc3b12851ebc2340147d36"} Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.929032 4699 scope.go:117] "RemoveContainer" containerID="adfcd113ddb0a425861801cc4ebb792f2beb998c582350cf8e362757c1b7afa5" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.929288 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.941939 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:29:04 crc kubenswrapper[4699]: I1122 04:29:04.942409 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0f028f32-9e14-40c5-9944-3fed1f6c2aee","Type":"ContainerDied","Data":"7b1dc965d67858085fc71a25f103fa3eb8408de947fee264fd2c8557578b21e7"} Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.033876 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.041701 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.051721 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.064766 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.075836 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.077736 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.079618 4699 scope.go:117] "RemoveContainer" containerID="0965e0e4b0f9d33255e1ca3fe32acf964e60ff77ca0da40d7a5561ecbf5baff7" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.085872 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4kqxm" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.086050 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.086247 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.086114 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.103596 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.105532 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.110157 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.110482 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.120380 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.120750 4699 scope.go:117] "RemoveContainer" containerID="9c3a6f68089699e05f57c467995c468566c3aee2a5f9f2e59cb387045899b545" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.128602 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.167448 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.167718 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.167821 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.167913 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.168007 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.168102 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.168179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.168245 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr8jq\" 
(UniqueName: \"kubernetes.io/projected/c0c750d0-0c65-4609-8ce0-5634ce490fc2-kube-api-access-tr8jq\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.178564 4699 scope.go:117] "RemoveContainer" containerID="af2485d201b5d7082d752db10c95f6a969bf19c959ffbcb882551a8bf0e3248a" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272614 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272698 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272733 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272769 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc 
kubenswrapper[4699]: I1122 04:29:05.272794 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272838 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272862 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr8jq\" (UniqueName: \"kubernetes.io/projected/c0c750d0-0c65-4609-8ce0-5634ce490fc2-kube-api-access-tr8jq\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272886 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272949 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.272983 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273017 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273056 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273104 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgwc\" (UniqueName: \"kubernetes.io/projected/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-kube-api-access-tpgwc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273133 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc 
kubenswrapper[4699]: I1122 04:29:05.273159 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273179 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-logs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.273904 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.279877 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.287596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.287878 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0c750d0-0c65-4609-8ce0-5634ce490fc2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.302064 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.314639 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.319891 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr8jq\" (UniqueName: \"kubernetes.io/projected/c0c750d0-0c65-4609-8ce0-5634ce490fc2-kube-api-access-tr8jq\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.324582 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c750d0-0c65-4609-8ce0-5634ce490fc2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.339275 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"c0c750d0-0c65-4609-8ce0-5634ce490fc2\") " pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374459 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374513 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374545 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374565 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374631 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374654 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374698 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgwc\" (UniqueName: \"kubernetes.io/projected/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-kube-api-access-tpgwc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.374721 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-logs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.376096 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-logs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.376353 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 
22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.376706 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.388061 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.392883 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.393147 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.396978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.421177 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgwc\" (UniqueName: \"kubernetes.io/projected/3e4aa03f-40fe-45cb-8a03-445afd58f5b7-kube-api-access-tpgwc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.441929 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.472043 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f028f32-9e14-40c5-9944-3fed1f6c2aee" path="/var/lib/kubelet/pods/0f028f32-9e14-40c5-9944-3fed1f6c2aee/volumes" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.473005 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832e24bf-7e8a-4e3f-be16-acd8ecff139d" path="/var/lib/kubelet/pods/832e24bf-7e8a-4e3f-be16-acd8ecff139d/volumes" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.475390 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9bd4fef-05a7-44fd-9c7d-dd9118839aa6" path="/var/lib/kubelet/pods/c9bd4fef-05a7-44fd-9c7d-dd9118839aa6/volumes" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.476048 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.504815 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"3e4aa03f-40fe-45cb-8a03-445afd58f5b7\") " pod="openstack/glance-default-external-api-0" Nov 22 04:29:05 crc kubenswrapper[4699]: I1122 04:29:05.705529 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 04:29:06 crc kubenswrapper[4699]: I1122 04:29:05.999771 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-kdn8x" event={"ID":"57ead407-5bf6-4cc4-ac17-e939d329f220","Type":"ContainerStarted","Data":"8e1f5667779fa293fc95229d15ce15a9f39614ef08c134571d8b731fa48c5249"} Nov 22 04:29:06 crc kubenswrapper[4699]: I1122 04:29:06.008140 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerStarted","Data":"826c0c30c91b03d60a5a8d697c887448322e253aff9d5cf44e13570f83c1cffe"} Nov 22 04:29:06 crc kubenswrapper[4699]: I1122 04:29:06.021280 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-kdn8x" podStartSLOduration=3.170863279 podStartE2EDuration="36.021265738s" podCreationTimestamp="2025-11-22 04:28:30 +0000 UTC" firstStartedPulling="2025-11-22 04:28:32.045977868 +0000 UTC m=+1263.388599055" lastFinishedPulling="2025-11-22 04:29:04.896380327 +0000 UTC m=+1296.239001514" observedRunningTime="2025-11-22 04:29:06.019328492 +0000 UTC m=+1297.361949679" watchObservedRunningTime="2025-11-22 04:29:06.021265738 +0000 UTC m=+1297.363886925" Nov 22 04:29:06 crc kubenswrapper[4699]: I1122 04:29:06.155756 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 04:29:06 crc kubenswrapper[4699]: I1122 04:29:06.398373 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 04:29:07 crc kubenswrapper[4699]: I1122 04:29:07.056128 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0c750d0-0c65-4609-8ce0-5634ce490fc2","Type":"ContainerStarted","Data":"25370ac5d006b9d922e50fa59eb9c7a5b4db81e37c59a7be5fa0583543cc8750"} Nov 22 04:29:07 crc 
kubenswrapper[4699]: I1122 04:29:07.060183 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerStarted","Data":"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb"} Nov 22 04:29:07 crc kubenswrapper[4699]: I1122 04:29:07.061171 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e4aa03f-40fe-45cb-8a03-445afd58f5b7","Type":"ContainerStarted","Data":"902ab40997ad2f510057ff5d34a70c4f6f1d9216f4025a3fac324cc9cc55e622"} Nov 22 04:29:08 crc kubenswrapper[4699]: I1122 04:29:08.072357 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0c750d0-0c65-4609-8ce0-5634ce490fc2","Type":"ContainerStarted","Data":"f816cb4fd6350ec04638a50dd13badc7abce2db18a23dcf17577559b6bf942f8"} Nov 22 04:29:08 crc kubenswrapper[4699]: I1122 04:29:08.075282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerStarted","Data":"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac"} Nov 22 04:29:08 crc kubenswrapper[4699]: I1122 04:29:08.077152 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3e4aa03f-40fe-45cb-8a03-445afd58f5b7","Type":"ContainerStarted","Data":"c4aa829de219cb1058bf176156eb4b43a65f8511fe43fd71119f3965ca3e4fc1"} Nov 22 04:29:08 crc kubenswrapper[4699]: I1122 04:29:08.080090 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"ed069f697e5beb0b68f6da32dfb9b0f6c439f25643b5daeed8781aec98fd2d56"} Nov 22 04:29:09 crc kubenswrapper[4699]: I1122 04:29:09.092012 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3e4aa03f-40fe-45cb-8a03-445afd58f5b7","Type":"ContainerStarted","Data":"e9c5c41a7a87b7a811c087b526f47a32881b32fcbcf06377df80c6d4097d83c8"} Nov 22 04:29:09 crc kubenswrapper[4699]: I1122 04:29:09.093667 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c0c750d0-0c65-4609-8ce0-5634ce490fc2","Type":"ContainerStarted","Data":"ebd11b35bd67cbc7a4acd05014f3f12535bf53b02257234647ea7ae7561f2eed"} Nov 22 04:29:09 crc kubenswrapper[4699]: I1122 04:29:09.116344 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.116323432 podStartE2EDuration="4.116323432s" podCreationTimestamp="2025-11-22 04:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:09.11285066 +0000 UTC m=+1300.455471847" watchObservedRunningTime="2025-11-22 04:29:09.116323432 +0000 UTC m=+1300.458944629" Nov 22 04:29:09 crc kubenswrapper[4699]: I1122 04:29:09.135540 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.135525075 podStartE2EDuration="4.135525075s" podCreationTimestamp="2025-11-22 04:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:09.133352434 +0000 UTC m=+1300.475973621" watchObservedRunningTime="2025-11-22 04:29:09.135525075 +0000 UTC m=+1300.478146262" Nov 22 04:29:10 crc kubenswrapper[4699]: I1122 04:29:10.111928 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerStarted","Data":"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca"} Nov 22 04:29:13 crc kubenswrapper[4699]: I1122 04:29:13.137996 4699 generic.go:334] 
"Generic (PLEG): container finished" podID="6b0a42c8-e8a1-45b3-9f29-77459d98ea4d" containerID="ed069f697e5beb0b68f6da32dfb9b0f6c439f25643b5daeed8781aec98fd2d56" exitCode=0 Nov 22 04:29:13 crc kubenswrapper[4699]: I1122 04:29:13.138078 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerDied","Data":"ed069f697e5beb0b68f6da32dfb9b0f6c439f25643b5daeed8781aec98fd2d56"} Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.443143 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.443579 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.471621 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.482903 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.706363 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.706417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.738181 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 04:29:15 crc kubenswrapper[4699]: I1122 04:29:15.749794 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 04:29:16 crc kubenswrapper[4699]: I1122 
04:29:16.165269 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 04:29:16 crc kubenswrapper[4699]: I1122 04:29:16.165324 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 04:29:16 crc kubenswrapper[4699]: I1122 04:29:16.165334 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:16 crc kubenswrapper[4699]: I1122 04:29:16.165344 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.109286 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.188819 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.191275 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.195301 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.195399 4699 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 04:29:18 crc kubenswrapper[4699]: I1122 04:29:18.201464 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 04:29:20 crc kubenswrapper[4699]: I1122 04:29:20.221893 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerStarted","Data":"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d"} Nov 22 04:29:20 crc 
kubenswrapper[4699]: I1122 04:29:20.222331 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 04:29:20 crc kubenswrapper[4699]: I1122 04:29:20.224309 4699 generic.go:334] "Generic (PLEG): container finished" podID="57ead407-5bf6-4cc4-ac17-e939d329f220" containerID="8e1f5667779fa293fc95229d15ce15a9f39614ef08c134571d8b731fa48c5249" exitCode=0 Nov 22 04:29:20 crc kubenswrapper[4699]: I1122 04:29:20.224347 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-kdn8x" event={"ID":"57ead407-5bf6-4cc4-ac17-e939d329f220","Type":"ContainerDied","Data":"8e1f5667779fa293fc95229d15ce15a9f39614ef08c134571d8b731fa48c5249"} Nov 22 04:29:20 crc kubenswrapper[4699]: I1122 04:29:20.258893 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.33079693 podStartE2EDuration="16.258872129s" podCreationTimestamp="2025-11-22 04:29:04 +0000 UTC" firstStartedPulling="2025-11-22 04:29:05.487219747 +0000 UTC m=+1296.829840944" lastFinishedPulling="2025-11-22 04:29:19.415294966 +0000 UTC m=+1310.757916143" observedRunningTime="2025-11-22 04:29:20.251869504 +0000 UTC m=+1311.594490711" watchObservedRunningTime="2025-11-22 04:29:20.258872129 +0000 UTC m=+1311.601493316" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.023974 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.118747 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.118821 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.118868 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.118901 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zdsp\" (UniqueName: \"kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.120042 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.120118 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: 
\"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.120144 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"57ead407-5bf6-4cc4-ac17-e939d329f220\" (UID: \"57ead407-5bf6-4cc4-ac17-e939d329f220\") " Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.120510 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.120726 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.121142 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.121155 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/57ead407-5bf6-4cc4-ac17-e939d329f220-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.125532 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.125593 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts" (OuterVolumeSpecName: "scripts") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.125725 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp" (OuterVolumeSpecName: "kube-api-access-5zdsp") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "kube-api-access-5zdsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.159861 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.164604 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config" (OuterVolumeSpecName: "config") pod "57ead407-5bf6-4cc4-ac17-e939d329f220" (UID: "57ead407-5bf6-4cc4-ac17-e939d329f220"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.223732 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.223804 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zdsp\" (UniqueName: \"kubernetes.io/projected/57ead407-5bf6-4cc4-ac17-e939d329f220-kube-api-access-5zdsp\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.223825 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.223862 4699 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/57ead407-5bf6-4cc4-ac17-e939d329f220-etc-podinfo\") on node \"crc\" DevicePath \"\"" Nov 22 
04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.223877 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/57ead407-5bf6-4cc4-ac17-e939d329f220-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.262402 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-kdn8x" event={"ID":"57ead407-5bf6-4cc4-ac17-e939d329f220","Type":"ContainerDied","Data":"850b4b7ae1c0c871a828822fb8002c58f7c553e12318fc39694bff68ecec7d5b"} Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.262659 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="850b4b7ae1c0c871a828822fb8002c58f7c553e12318fc39694bff68ecec7d5b" Nov 22 04:29:23 crc kubenswrapper[4699]: I1122 04:29:23.262806 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-kdn8x" Nov 22 04:29:24 crc kubenswrapper[4699]: I1122 04:29:24.274370 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"a61d8c7fa94c040b3d1169b2c17f00afe5d612c8291396061cb0c712611ae221"} Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.255962 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:25 crc kubenswrapper[4699]: E1122 04:29:25.256354 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57ead407-5bf6-4cc4-ac17-e939d329f220" containerName="ironic-inspector-db-sync" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.256370 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="57ead407-5bf6-4cc4-ac17-e939d329f220" containerName="ironic-inspector-db-sync" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.256547 4699 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="57ead407-5bf6-4cc4-ac17-e939d329f220" containerName="ironic-inspector-db-sync" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.258790 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.269190 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.269755 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.291033 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.360997 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361103 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361140 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc88n\" (UniqueName: \"kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361212 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361236 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361268 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.361303 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.462639 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.463493 4699 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.464068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc88n\" (UniqueName: \"kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.464224 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.464335 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.464000 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.465658 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.470136 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.471296 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.471494 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.471961 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.475065 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " 
pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.479344 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.487011 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc88n\" (UniqueName: \"kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n\") pod \"ironic-inspector-0\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:25 crc kubenswrapper[4699]: I1122 04:29:25.582385 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:26 crc kubenswrapper[4699]: I1122 04:29:26.067942 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:26 crc kubenswrapper[4699]: I1122 04:29:26.322466 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045e122d-6d75-4ae6-8c84-e56afecd7028","Type":"ContainerStarted","Data":"627130cabcfe6dfec14d94bba42852d03ef8b6c39a3608132289951ee56f8cfb"} Nov 22 04:29:27 crc kubenswrapper[4699]: I1122 04:29:27.331039 4699 generic.go:334] "Generic (PLEG): container finished" podID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerID="08187343cb6205e2ade5792a4e76a57581d8a358469ef1ed94f8f4bb5c61c805" exitCode=0 Nov 22 04:29:27 crc kubenswrapper[4699]: I1122 04:29:27.331136 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045e122d-6d75-4ae6-8c84-e56afecd7028","Type":"ContainerDied","Data":"08187343cb6205e2ade5792a4e76a57581d8a358469ef1ed94f8f4bb5c61c805"} Nov 22 04:29:27 crc kubenswrapper[4699]: I1122 
04:29:27.878903 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:28 crc kubenswrapper[4699]: I1122 04:29:28.341151 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045e122d-6d75-4ae6-8c84-e56afecd7028","Type":"ContainerStarted","Data":"f108f57960e1e5146848f3355722d0bb2c4e07937590eca870e99707cc38c842"} Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.353565 4699 generic.go:334] "Generic (PLEG): container finished" podID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerID="f108f57960e1e5146848f3355722d0bb2c4e07937590eca870e99707cc38c842" exitCode=0 Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.353749 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045e122d-6d75-4ae6-8c84-e56afecd7028","Type":"ContainerDied","Data":"f108f57960e1e5146848f3355722d0bb2c4e07937590eca870e99707cc38c842"} Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.802660 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.969894 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.969951 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.969983 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970033 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970094 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970148 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fc88n\" (UniqueName: \"kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970245 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle\") pod \"045e122d-6d75-4ae6-8c84-e56afecd7028\" (UID: \"045e122d-6d75-4ae6-8c84-e56afecd7028\") " Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970238 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.970837 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.974818 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "var-lib-ironic". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.976951 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n" (OuterVolumeSpecName: "kube-api-access-fc88n") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "kube-api-access-fc88n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.981596 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.981683 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts" (OuterVolumeSpecName: "scripts") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:29 crc kubenswrapper[4699]: I1122 04:29:29.999337 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config" (OuterVolumeSpecName: "config") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.032236 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "045e122d-6d75-4ae6-8c84-e56afecd7028" (UID: "045e122d-6d75-4ae6-8c84-e56afecd7028"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072802 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072852 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072868 4699 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/045e122d-6d75-4ae6-8c84-e56afecd7028-etc-podinfo\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072883 4699 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/045e122d-6d75-4ae6-8c84-e56afecd7028-var-lib-ironic\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072897 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc88n\" (UniqueName: \"kubernetes.io/projected/045e122d-6d75-4ae6-8c84-e56afecd7028-kube-api-access-fc88n\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.072911 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/045e122d-6d75-4ae6-8c84-e56afecd7028-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.365263 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"045e122d-6d75-4ae6-8c84-e56afecd7028","Type":"ContainerDied","Data":"627130cabcfe6dfec14d94bba42852d03ef8b6c39a3608132289951ee56f8cfb"} Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.365325 4699 scope.go:117] "RemoveContainer" containerID="f108f57960e1e5146848f3355722d0bb2c4e07937590eca870e99707cc38c842" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.365357 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.400386 4699 scope.go:117] "RemoveContainer" containerID="08187343cb6205e2ade5792a4e76a57581d8a358469ef1ed94f8f4bb5c61c805" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.483543 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.514616 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.528814 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:30 crc kubenswrapper[4699]: E1122 04:29:30.529534 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerName="inspector-pxe-init" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.529551 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerName="inspector-pxe-init" Nov 22 04:29:30 crc kubenswrapper[4699]: E1122 04:29:30.529569 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" 
containerName="ironic-python-agent-init" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.529577 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerName="ironic-python-agent-init" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.530875 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" containerName="inspector-pxe-init" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.533462 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.537121 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.537133 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.537569 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.537816 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.548820 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.681967 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682022 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682071 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682339 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-scripts\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682632 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-config\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682686 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk55w\" (UniqueName: \"kubernetes.io/projected/3c8864ff-6365-44c6-8fe0-134c7f25b176-kube-api-access-sk55w\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682740 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c8864ff-6365-44c6-8fe0-134c7f25b176-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.682760 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.784828 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-scripts\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785216 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-config\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785245 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk55w\" (UniqueName: 
\"kubernetes.io/projected/3c8864ff-6365-44c6-8fe0-134c7f25b176-kube-api-access-sk55w\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785268 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c8864ff-6365-44c6-8fe0-134c7f25b176-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785288 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785342 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785365 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785407 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic-inspector-dhcp-hostsdir\") pod 
\"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.785436 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.786486 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.786490 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/3c8864ff-6365-44c6-8fe0-134c7f25b176-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.789947 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.791118 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3c8864ff-6365-44c6-8fe0-134c7f25b176-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc 
kubenswrapper[4699]: I1122 04:29:30.791210 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-scripts\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.791289 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.795239 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.796157 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3c8864ff-6365-44c6-8fe0-134c7f25b176-config\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.804259 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk55w\" (UniqueName: \"kubernetes.io/projected/3c8864ff-6365-44c6-8fe0-134c7f25b176-kube-api-access-sk55w\") pod \"ironic-inspector-0\" (UID: \"3c8864ff-6365-44c6-8fe0-134c7f25b176\") " pod="openstack/ironic-inspector-0" Nov 22 04:29:30 crc kubenswrapper[4699]: I1122 04:29:30.869045 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Nov 22 04:29:31 crc kubenswrapper[4699]: I1122 04:29:31.308039 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Nov 22 04:29:31 crc kubenswrapper[4699]: I1122 04:29:31.377220 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerStarted","Data":"71f16eb1cbc3d7ffafe76a627b7a21c17cc53971f67758609b32d0c5bc4440a4"} Nov 22 04:29:31 crc kubenswrapper[4699]: I1122 04:29:31.458286 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="045e122d-6d75-4ae6-8c84-e56afecd7028" path="/var/lib/kubelet/pods/045e122d-6d75-4ae6-8c84-e56afecd7028/volumes" Nov 22 04:29:32 crc kubenswrapper[4699]: I1122 04:29:32.397748 4699 generic.go:334] "Generic (PLEG): container finished" podID="3c8864ff-6365-44c6-8fe0-134c7f25b176" containerID="cd5fe3d73bfae92dd9fb96377e929b0bc0bb9fde242de756934eb4bb2c3376bd" exitCode=0 Nov 22 04:29:32 crc kubenswrapper[4699]: I1122 04:29:32.397864 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerDied","Data":"cd5fe3d73bfae92dd9fb96377e929b0bc0bb9fde242de756934eb4bb2c3376bd"} Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.296628 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.297196 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-central-agent" containerID="cri-o://3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb" gracePeriod=30 Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.297653 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" containerID="cri-o://f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d" gracePeriod=30 Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.297704 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="sg-core" containerID="cri-o://05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca" gracePeriod=30 Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.297744 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-notification-agent" containerID="cri-o://637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac" gracePeriod=30 Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.319137 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.407202 4699 generic.go:334] "Generic (PLEG): container finished" podID="3c8864ff-6365-44c6-8fe0-134c7f25b176" containerID="8010228b46875d0b63687f5679997d774cafb6bc7a1cae0263f0e005059bd545" exitCode=0 Nov 22 04:29:33 crc kubenswrapper[4699]: I1122 04:29:33.407246 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerDied","Data":"8010228b46875d0b63687f5679997d774cafb6bc7a1cae0263f0e005059bd545"} Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.420381 4699 generic.go:334] "Generic (PLEG): container finished" podID="66de844d-ddf5-4823-8292-8611e737acd4" containerID="f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d" exitCode=0 Nov 22 04:29:34 
crc kubenswrapper[4699]: I1122 04:29:34.420994 4699 generic.go:334] "Generic (PLEG): container finished" podID="66de844d-ddf5-4823-8292-8611e737acd4" containerID="05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca" exitCode=2 Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.421010 4699 generic.go:334] "Generic (PLEG): container finished" podID="66de844d-ddf5-4823-8292-8611e737acd4" containerID="3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb" exitCode=0 Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.420844 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerDied","Data":"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d"} Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.421085 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerDied","Data":"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca"} Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.421104 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerDied","Data":"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb"} Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.429114 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerStarted","Data":"ea5eb73ec943c19f95b4c74c90ba8f0193f1c24d6fc802b5a835925030f35926"} Nov 22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.429164 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerStarted","Data":"bab38c21d8c9753423353a97fdf969c13650491029b8fdaaaca80fcecc6651b5"} Nov 
22 04:29:34 crc kubenswrapper[4699]: I1122 04:29:34.905774 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.185:3000/\": dial tcp 10.217.0.185:3000: connect: connection refused" Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.440480 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerStarted","Data":"4de71c7c2aa0671c8675e0e52b9f9daf9e10973daf67134de766c83c8556cc6d"} Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.440899 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.440913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"3c8864ff-6365-44c6-8fe0-134c7f25b176","Type":"ContainerStarted","Data":"932fadf654dbdf5b42bafc0b1746caf4adfc501585b0457024d95b67f82841e0"} Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.442162 4699 generic.go:334] "Generic (PLEG): container finished" podID="2543fbe9-12e0-40d5-8474-dab6ed3144be" containerID="624b463938f550cc17b4e670673bf66a20b8b7cf86c27c5f9a60406c37b18761" exitCode=0 Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.442240 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" event={"ID":"2543fbe9-12e0-40d5-8474-dab6ed3144be","Type":"ContainerDied","Data":"624b463938f550cc17b4e670673bf66a20b8b7cf86c27c5f9a60406c37b18761"} Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.471278 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=5.471258919 podStartE2EDuration="5.471258919s" podCreationTimestamp="2025-11-22 04:29:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:35.465152285 +0000 UTC m=+1326.807773482" watchObservedRunningTime="2025-11-22 04:29:35.471258919 +0000 UTC m=+1326.813880106" Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.869943 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Nov 22 04:29:35 crc kubenswrapper[4699]: I1122 04:29:35.869986 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.873054 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.921821 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data\") pod \"2543fbe9-12e0-40d5-8474-dab6ed3144be\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.921893 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fl98\" (UniqueName: \"kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98\") pod \"2543fbe9-12e0-40d5-8474-dab6ed3144be\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.922232 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle\") pod \"2543fbe9-12e0-40d5-8474-dab6ed3144be\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.922281 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts\") pod \"2543fbe9-12e0-40d5-8474-dab6ed3144be\" (UID: \"2543fbe9-12e0-40d5-8474-dab6ed3144be\") " Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.928638 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts" (OuterVolumeSpecName: "scripts") pod "2543fbe9-12e0-40d5-8474-dab6ed3144be" (UID: "2543fbe9-12e0-40d5-8474-dab6ed3144be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.930399 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98" (OuterVolumeSpecName: "kube-api-access-9fl98") pod "2543fbe9-12e0-40d5-8474-dab6ed3144be" (UID: "2543fbe9-12e0-40d5-8474-dab6ed3144be"). InnerVolumeSpecName "kube-api-access-9fl98". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.960848 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2543fbe9-12e0-40d5-8474-dab6ed3144be" (UID: "2543fbe9-12e0-40d5-8474-dab6ed3144be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:36 crc kubenswrapper[4699]: I1122 04:29:36.990556 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data" (OuterVolumeSpecName: "config-data") pod "2543fbe9-12e0-40d5-8474-dab6ed3144be" (UID: "2543fbe9-12e0-40d5-8474-dab6ed3144be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.024599 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.024626 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.024637 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2543fbe9-12e0-40d5-8474-dab6ed3144be-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.024646 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fl98\" (UniqueName: \"kubernetes.io/projected/2543fbe9-12e0-40d5-8474-dab6ed3144be-kube-api-access-9fl98\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.230703 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.328954 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329030 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329102 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccnvr\" (UniqueName: \"kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329182 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329357 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329429 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329523 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329612 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data\") pod \"66de844d-ddf5-4823-8292-8611e737acd4\" (UID: \"66de844d-ddf5-4823-8292-8611e737acd4\") " Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.329939 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.330103 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.330127 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66de844d-ddf5-4823-8292-8611e737acd4-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.337650 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr" (OuterVolumeSpecName: "kube-api-access-ccnvr") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "kube-api-access-ccnvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.350669 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts" (OuterVolumeSpecName: "scripts") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.360834 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.431544 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.431577 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.431591 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccnvr\" (UniqueName: \"kubernetes.io/projected/66de844d-ddf5-4823-8292-8611e737acd4-kube-api-access-ccnvr\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.431751 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data" (OuterVolumeSpecName: "config-data") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.465025 4699 generic.go:334] "Generic (PLEG): container finished" podID="66de844d-ddf5-4823-8292-8611e737acd4" containerID="637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac" exitCode=0 Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.465115 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.470435 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.532722 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.538433 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerDied","Data":"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac"} Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.538500 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66de844d-ddf5-4823-8292-8611e737acd4","Type":"ContainerDied","Data":"826c0c30c91b03d60a5a8d697c887448322e253aff9d5cf44e13570f83c1cffe"} Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.538515 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wjmlk" event={"ID":"2543fbe9-12e0-40d5-8474-dab6ed3144be","Type":"ContainerDied","Data":"d82b671dbea9378a40c53b750ba6eec4a341d49dc8f0555d825aea1d4737829c"} Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.538532 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d82b671dbea9378a40c53b750ba6eec4a341d49dc8f0555d825aea1d4737829c" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.538550 4699 scope.go:117] "RemoveContainer" containerID="f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.565682 4699 scope.go:117] "RemoveContainer" containerID="05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.580702 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66de844d-ddf5-4823-8292-8611e737acd4" (UID: "66de844d-ddf5-4823-8292-8611e737acd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.609209 4699 scope.go:117] "RemoveContainer" containerID="637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629031 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.629569 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-central-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629591 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-central-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.629621 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2543fbe9-12e0-40d5-8474-dab6ed3144be" containerName="nova-cell0-conductor-db-sync" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629629 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2543fbe9-12e0-40d5-8474-dab6ed3144be" containerName="nova-cell0-conductor-db-sync" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.629645 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="sg-core" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629653 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="sg-core" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.629667 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629674 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.629704 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-notification-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629711 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-notification-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629958 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-central-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.629978 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="sg-core" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.630005 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="proxy-httpd" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.630014 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2543fbe9-12e0-40d5-8474-dab6ed3144be" containerName="nova-cell0-conductor-db-sync" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.630028 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66de844d-ddf5-4823-8292-8611e737acd4" containerName="ceilometer-notification-agent" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.630780 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.634637 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.635008 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcb6d\" (UniqueName: \"kubernetes.io/projected/8624913e-b73b-41b8-ac5e-64d9114de859-kube-api-access-dcb6d\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.635109 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.635180 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.635255 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66de844d-ddf5-4823-8292-8611e737acd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.636042 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vvxqx" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.639710 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-conductor-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.659878 4699 scope.go:117] "RemoveContainer" containerID="3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.726268 4699 scope.go:117] "RemoveContainer" containerID="f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.726707 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d\": container with ID starting with f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d not found: ID does not exist" containerID="f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.726758 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d"} err="failed to get container status \"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d\": rpc error: code = NotFound desc = could not find container \"f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d\": container with ID starting with f466226d35b82f43ebadb0ef3560500f6779ccd51c0be43eb9d227dbd830954d not found: ID does not exist" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.726801 4699 scope.go:117] "RemoveContainer" containerID="05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.727091 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca\": container with ID starting with 05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca not 
found: ID does not exist" containerID="05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.727116 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca"} err="failed to get container status \"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca\": rpc error: code = NotFound desc = could not find container \"05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca\": container with ID starting with 05588c2751ba03288b6e0cce420814e1b4e7334758b23369489af8b03a42e9ca not found: ID does not exist" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.727133 4699 scope.go:117] "RemoveContainer" containerID="637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.727381 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac\": container with ID starting with 637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac not found: ID does not exist" containerID="637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.727561 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac"} err="failed to get container status \"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac\": rpc error: code = NotFound desc = could not find container \"637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac\": container with ID starting with 637112fec54059facbc8586385646b8d619c897f6cc7b19f26c64a673c6a2aac not found: ID does not exist" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.727647 
4699 scope.go:117] "RemoveContainer" containerID="3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb" Nov 22 04:29:37 crc kubenswrapper[4699]: E1122 04:29:37.727927 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb\": container with ID starting with 3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb not found: ID does not exist" containerID="3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.727954 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb"} err="failed to get container status \"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb\": rpc error: code = NotFound desc = could not find container \"3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb\": container with ID starting with 3612d436cf3c59cf5e082b0a36e50bf22168d80ffc9393357d86545635afdabb not found: ID does not exist" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.736759 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.737013 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.737174 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcb6d\" (UniqueName: \"kubernetes.io/projected/8624913e-b73b-41b8-ac5e-64d9114de859-kube-api-access-dcb6d\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.741350 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.741541 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8624913e-b73b-41b8-ac5e-64d9114de859-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.753180 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcb6d\" (UniqueName: \"kubernetes.io/projected/8624913e-b73b-41b8-ac5e-64d9114de859-kube-api-access-dcb6d\") pod \"nova-cell0-conductor-0\" (UID: \"8624913e-b73b-41b8-ac5e-64d9114de859\") " pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.801162 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.815661 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.830357 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.833585 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838524 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k575c\" (UniqueName: \"kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838567 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838643 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838712 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838755 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838771 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd\") pod 
\"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838819 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838837 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.838958 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.866947 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.941756 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k575c\" (UniqueName: \"kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.941827 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.942073 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.942117 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.942142 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.942182 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.942204 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.943522 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 
04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.943534 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.947367 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.947377 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.948016 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.953104 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.953259 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 04:29:37 crc kubenswrapper[4699]: I1122 04:29:37.962892 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k575c\" (UniqueName: \"kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c\") pod \"ceilometer-0\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " pod="openstack/ceilometer-0" Nov 22 04:29:38 crc kubenswrapper[4699]: I1122 04:29:38.154717 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:29:38 crc kubenswrapper[4699]: I1122 04:29:38.474377 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 04:29:38 crc kubenswrapper[4699]: W1122 04:29:38.477201 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8624913e_b73b_41b8_ac5e_64d9114de859.slice/crio-c5efaa2893a088e076cc7f61533017c255e9a5bd1c363b60b59a18c7cbc64592 WatchSource:0}: Error finding container c5efaa2893a088e076cc7f61533017c255e9a5bd1c363b60b59a18c7cbc64592: Status 404 returned error can't find the container with id c5efaa2893a088e076cc7f61533017c255e9a5bd1c363b60b59a18c7cbc64592 Nov 22 04:29:38 crc kubenswrapper[4699]: I1122 04:29:38.611694 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.465244 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66de844d-ddf5-4823-8292-8611e737acd4" path="/var/lib/kubelet/pods/66de844d-ddf5-4823-8292-8611e737acd4/volumes" Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.499786 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"8624913e-b73b-41b8-ac5e-64d9114de859","Type":"ContainerStarted","Data":"f16a462348e02da8352e514297d8df50db21c8f865686915c72721cc13703acf"}
Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.499837 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"8624913e-b73b-41b8-ac5e-64d9114de859","Type":"ContainerStarted","Data":"c5efaa2893a088e076cc7f61533017c255e9a5bd1c363b60b59a18c7cbc64592"}
Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.500393 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.502456 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerStarted","Data":"fba07e7dc8810cbcd10e718beb31e93e165755913086d5baa9b1c2c0256a0f93"}
Nov 22 04:29:39 crc kubenswrapper[4699]: I1122 04:29:39.520296 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5202797009999998 podStartE2EDuration="2.520279701s" podCreationTimestamp="2025-11-22 04:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:39.512674892 +0000 UTC m=+1330.855296089" watchObservedRunningTime="2025-11-22 04:29:39.520279701 +0000 UTC m=+1330.862900888"
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.513581 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerStarted","Data":"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c"}
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.869805 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.870115 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.871393 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.959021 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Nov 22 04:29:40 crc kubenswrapper[4699]: I1122 04:29:40.961627 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Nov 22 04:29:41 crc kubenswrapper[4699]: I1122 04:29:41.522112 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerStarted","Data":"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b"}
Nov 22 04:29:41 crc kubenswrapper[4699]: I1122 04:29:41.529405 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Nov 22 04:29:41 crc kubenswrapper[4699]: I1122 04:29:41.533956 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Nov 22 04:29:42 crc kubenswrapper[4699]: I1122 04:29:42.533766 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerStarted","Data":"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee"}
Nov 22 04:29:43 crc kubenswrapper[4699]: I1122 04:29:43.544271 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerStarted","Data":"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a"}
Nov 22 04:29:43 crc kubenswrapper[4699]: I1122 04:29:43.545017 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 04:29:43 crc kubenswrapper[4699]: I1122 04:29:43.567499 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.118198956 podStartE2EDuration="6.567478971s" podCreationTimestamp="2025-11-22 04:29:37 +0000 UTC" firstStartedPulling="2025-11-22 04:29:38.616716224 +0000 UTC m=+1329.959337411" lastFinishedPulling="2025-11-22 04:29:43.065996239 +0000 UTC m=+1334.408617426" observedRunningTime="2025-11-22 04:29:43.563820465 +0000 UTC m=+1334.906441652" watchObservedRunningTime="2025-11-22 04:29:43.567478971 +0000 UTC m=+1334.910100148"
Nov 22 04:29:47 crc kubenswrapper[4699]: I1122 04:29:47.980876 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.416754 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-72t27"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.417990 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.424816 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.424947 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.433029 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-72t27"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.449798 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.450100 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.450261 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.450295 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc67\" (UniqueName: \"kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.551735 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.551869 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.551956 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.551983 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc67\" (UniqueName: \"kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.558094 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.558966 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.573265 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.577055 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc67\" (UniqueName: \"kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67\") pod \"nova-cell0-cell-mapping-72t27\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.596911 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.601384 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.606745 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.617016 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.653998 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.654125 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.654154 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.654188 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8n2\" (UniqueName: \"kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.684226 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.685761 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.689666 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.694296 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.744709 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.745731 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72t27"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.746097 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.748335 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758108 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758146 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758224 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758248 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758270 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758294 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758326 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8n2\" (UniqueName: \"kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758345 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758368 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thjfx\" (UniqueName: \"kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758411 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztmtk\" (UniqueName: \"kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.758920 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.765054 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.766116 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.781563 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8n2\" (UniqueName: \"kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.802966 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.845556 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.847003 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.850749 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864284 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864354 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864406 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864462 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864501 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864530 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864561 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwph\" (UniqueName: \"kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864594 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864628 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thjfx\" (UniqueName: \"kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.864672 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztmtk\" (UniqueName: \"kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.900018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.901107 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztmtk\" (UniqueName: \"kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.901319 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.905827 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.908110 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.920081 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thjfx\" (UniqueName: \"kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx\") pod \"nova-scheduler-0\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " pod="openstack/nova-scheduler-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.947489 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.959045 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"]
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.960551 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.960856 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975367 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975731 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975769 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975794 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85td\" (UniqueName: \"kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975824 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975842 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975862 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975936 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwph\" (UniqueName: \"kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.975971 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.976016 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.979637 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.980512 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.980978 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:48 crc kubenswrapper[4699]: I1122 04:29:48.983030 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.006213 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"]
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.019731 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwph\" (UniqueName: \"kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph\") pod \"nova-metadata-0\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " pod="openstack/nova-metadata-0"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.024977 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.077789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.077853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.077999 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.078039 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h85td\" (UniqueName: \"kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.078069 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.078099 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.081452 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.082579 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.086673 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.086760 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.087395 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.134674 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85td\" (UniqueName: \"kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td\") pod \"dnsmasq-dns-757b4f8459-m55tb\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.306872 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.360724 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-m55tb"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.402763 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-72t27"]
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.625685 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72t27" event={"ID":"88235903-ad93-439b-94a0-cd4afd05370f","Type":"ContainerStarted","Data":"2fd47d709b91d2f454a310a2d470c5405695da62aac4bc990772cfc28ea05fee"}
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.626711 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.674506 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zvrvx"]
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.676029 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zvrvx"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.679927 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.680584 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.688308 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zvrvx"]
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.696253 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.696668 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.696801 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ml2v\" (UniqueName: \"kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx"
Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.697029 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.798900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.798950 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ml2v\" (UniqueName: \"kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.799045 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.799125 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.803741 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.804278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.814177 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.840478 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ml2v\" (UniqueName: \"kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v\") pod \"nova-cell1-conductor-db-sync-zvrvx\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.956916 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:49 crc kubenswrapper[4699]: W1122 04:29:49.965967 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265829e9_7219_4f62_becf_4f5b3aed77ae.slice/crio-f4303be0c70cca5c81f265d3fd163c8bcdc344011c6d893f72daa461ed2d868a WatchSource:0}: Error finding container 
f4303be0c70cca5c81f265d3fd163c8bcdc344011c6d893f72daa461ed2d868a: Status 404 returned error can't find the container with id f4303be0c70cca5c81f265d3fd163c8bcdc344011c6d893f72daa461ed2d868a Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.970629 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:29:49 crc kubenswrapper[4699]: I1122 04:29:49.979972 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.096676 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.120933 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"] Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.607007 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zvrvx"] Nov 22 04:29:50 crc kubenswrapper[4699]: W1122 04:29:50.617200 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfd327ba_6eeb_41a6_95f9_f2ad2385fcd1.slice/crio-250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe WatchSource:0}: Error finding container 250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe: Status 404 returned error can't find the container with id 250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.653351 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerStarted","Data":"f4303be0c70cca5c81f265d3fd163c8bcdc344011c6d893f72daa461ed2d868a"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.659714 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerStarted","Data":"a783412b472c851b66d15220ffe164c28477ab367d4d99ef54a27d8c8a773b19"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.665268 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c","Type":"ContainerStarted","Data":"30cf6bf49210f9b9bdba9fa2d0513ea94a5c2b08560592417a06252b217df8e0"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.683859 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" event={"ID":"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1","Type":"ContainerStarted","Data":"250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.689874 4699 generic.go:334] "Generic (PLEG): container finished" podID="b73072a8-266e-4352-8f8a-371f64be1988" containerID="8962410891dbd7d7cd624bd791f4233fad8ba12296a4b271e51cd53b7a12e51b" exitCode=0 Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.690049 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" event={"ID":"b73072a8-266e-4352-8f8a-371f64be1988","Type":"ContainerDied","Data":"8962410891dbd7d7cd624bd791f4233fad8ba12296a4b271e51cd53b7a12e51b"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.693559 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" event={"ID":"b73072a8-266e-4352-8f8a-371f64be1988","Type":"ContainerStarted","Data":"0f10df95e4ab69fa6c7043fb44e2e61e898c87df9ab517ceae3cd92ccaf1b686"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.696800 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72t27" event={"ID":"88235903-ad93-439b-94a0-cd4afd05370f","Type":"ContainerStarted","Data":"2c5de007d8fb94ac8a47a48b15acdd5cbfebbea9818405299e4042a67be3a215"} Nov 22 04:29:50 
crc kubenswrapper[4699]: I1122 04:29:50.700028 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f08adc05-ca24-491e-bf3c-ea49d60d9ca8","Type":"ContainerStarted","Data":"a4a32210bfd85fba344901abbb5e4107666f0818475921db3def910de56575ec"} Nov 22 04:29:50 crc kubenswrapper[4699]: I1122 04:29:50.740989 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-72t27" podStartSLOduration=2.740972732 podStartE2EDuration="2.740972732s" podCreationTimestamp="2025-11-22 04:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:50.738067744 +0000 UTC m=+1342.080688941" watchObservedRunningTime="2025-11-22 04:29:50.740972732 +0000 UTC m=+1342.083593919" Nov 22 04:29:51 crc kubenswrapper[4699]: I1122 04:29:51.726742 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" event={"ID":"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1","Type":"ContainerStarted","Data":"4f8c36580a32469f7b1a2b1c6ff3454a6d29bd61b38ca805d1c2fc218f05457c"} Nov 22 04:29:51 crc kubenswrapper[4699]: I1122 04:29:51.735510 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" event={"ID":"b73072a8-266e-4352-8f8a-371f64be1988","Type":"ContainerStarted","Data":"dffbde150b14f2f342df64f5c7e4cfd56ab88a5e00d59a77ba5240fb57ce1fa3"} Nov 22 04:29:51 crc kubenswrapper[4699]: I1122 04:29:51.737744 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" Nov 22 04:29:51 crc kubenswrapper[4699]: I1122 04:29:51.758345 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" podStartSLOduration=2.7583233160000002 podStartE2EDuration="2.758323316s" podCreationTimestamp="2025-11-22 04:29:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:51.746387014 +0000 UTC m=+1343.089008221" watchObservedRunningTime="2025-11-22 04:29:51.758323316 +0000 UTC m=+1343.100944503" Nov 22 04:29:51 crc kubenswrapper[4699]: I1122 04:29:51.773364 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" podStartSLOduration=3.7733431299999998 podStartE2EDuration="3.77334313s" podCreationTimestamp="2025-11-22 04:29:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:51.766281844 +0000 UTC m=+1343.108903051" watchObservedRunningTime="2025-11-22 04:29:51.77334313 +0000 UTC m=+1343.115964317" Nov 22 04:29:52 crc kubenswrapper[4699]: I1122 04:29:52.259019 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:52 crc kubenswrapper[4699]: I1122 04:29:52.272069 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.760163 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerStarted","Data":"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600"} Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.762354 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerStarted","Data":"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9"} Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.766818 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c","Type":"ContainerStarted","Data":"c9e75a351eef56340e30a3fd159bb197d4d0e4720077044dccb87ea5eabdb207"} Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.766918 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c9e75a351eef56340e30a3fd159bb197d4d0e4720077044dccb87ea5eabdb207" gracePeriod=30 Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.770337 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f08adc05-ca24-491e-bf3c-ea49d60d9ca8","Type":"ContainerStarted","Data":"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295"} Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.793882 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.277146929 podStartE2EDuration="5.793855762s" podCreationTimestamp="2025-11-22 04:29:48 +0000 UTC" firstStartedPulling="2025-11-22 04:29:49.661549005 +0000 UTC m=+1341.004170192" lastFinishedPulling="2025-11-22 04:29:53.178257838 +0000 UTC m=+1344.520879025" observedRunningTime="2025-11-22 04:29:53.791969617 +0000 UTC m=+1345.134590814" watchObservedRunningTime="2025-11-22 04:29:53.793855762 +0000 UTC m=+1345.136476949" Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.819796 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.609507629 podStartE2EDuration="5.819771083s" podCreationTimestamp="2025-11-22 04:29:48 +0000 UTC" firstStartedPulling="2025-11-22 04:29:49.983285615 +0000 UTC m=+1341.325906802" lastFinishedPulling="2025-11-22 04:29:53.193549069 +0000 UTC m=+1344.536170256" observedRunningTime="2025-11-22 04:29:53.811544499 +0000 UTC m=+1345.154165706" 
watchObservedRunningTime="2025-11-22 04:29:53.819771083 +0000 UTC m=+1345.162392270" Nov 22 04:29:53 crc kubenswrapper[4699]: I1122 04:29:53.961698 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.025715 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.794039 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerStarted","Data":"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5"} Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.799600 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-log" containerID="cri-o://963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" gracePeriod=30 Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.799902 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerStarted","Data":"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff"} Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.800268 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-metadata" containerID="cri-o://b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" gracePeriod=30 Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.834694 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.640345812 podStartE2EDuration="6.834661029s" podCreationTimestamp="2025-11-22 
04:29:48 +0000 UTC" firstStartedPulling="2025-11-22 04:29:49.983143422 +0000 UTC m=+1341.325764609" lastFinishedPulling="2025-11-22 04:29:53.177458639 +0000 UTC m=+1344.520079826" observedRunningTime="2025-11-22 04:29:54.812303231 +0000 UTC m=+1346.154924428" watchObservedRunningTime="2025-11-22 04:29:54.834661029 +0000 UTC m=+1346.177282236" Nov 22 04:29:54 crc kubenswrapper[4699]: I1122 04:29:54.854618 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.647108371 podStartE2EDuration="6.854514817s" podCreationTimestamp="2025-11-22 04:29:48 +0000 UTC" firstStartedPulling="2025-11-22 04:29:49.971946458 +0000 UTC m=+1341.314567645" lastFinishedPulling="2025-11-22 04:29:53.179352864 +0000 UTC m=+1344.521974091" observedRunningTime="2025-11-22 04:29:54.835614531 +0000 UTC m=+1346.178235728" watchObservedRunningTime="2025-11-22 04:29:54.854514817 +0000 UTC m=+1346.197136024" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.402866 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.456903 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcwph\" (UniqueName: \"kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph\") pod \"265829e9-7219-4f62-becf-4f5b3aed77ae\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.457041 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data\") pod \"265829e9-7219-4f62-becf-4f5b3aed77ae\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.457156 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle\") pod \"265829e9-7219-4f62-becf-4f5b3aed77ae\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.457218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs\") pod \"265829e9-7219-4f62-becf-4f5b3aed77ae\" (UID: \"265829e9-7219-4f62-becf-4f5b3aed77ae\") " Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.458357 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs" (OuterVolumeSpecName: "logs") pod "265829e9-7219-4f62-becf-4f5b3aed77ae" (UID: "265829e9-7219-4f62-becf-4f5b3aed77ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.464943 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph" (OuterVolumeSpecName: "kube-api-access-rcwph") pod "265829e9-7219-4f62-becf-4f5b3aed77ae" (UID: "265829e9-7219-4f62-becf-4f5b3aed77ae"). InnerVolumeSpecName "kube-api-access-rcwph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.488280 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "265829e9-7219-4f62-becf-4f5b3aed77ae" (UID: "265829e9-7219-4f62-becf-4f5b3aed77ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.502578 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data" (OuterVolumeSpecName: "config-data") pod "265829e9-7219-4f62-becf-4f5b3aed77ae" (UID: "265829e9-7219-4f62-becf-4f5b3aed77ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.559376 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcwph\" (UniqueName: \"kubernetes.io/projected/265829e9-7219-4f62-becf-4f5b3aed77ae-kube-api-access-rcwph\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.559534 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.560063 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265829e9-7219-4f62-becf-4f5b3aed77ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.560087 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265829e9-7219-4f62-becf-4f5b3aed77ae-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.807950 4699 generic.go:334] "Generic (PLEG): container finished" podID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerID="b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" exitCode=0 Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.808009 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerDied","Data":"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff"} Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.808041 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerDied","Data":"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600"} Nov 22 04:29:55 crc 
kubenswrapper[4699]: I1122 04:29:55.807984 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.808016 4699 generic.go:334] "Generic (PLEG): container finished" podID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerID="963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" exitCode=143 Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.808093 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"265829e9-7219-4f62-becf-4f5b3aed77ae","Type":"ContainerDied","Data":"f4303be0c70cca5c81f265d3fd163c8bcdc344011c6d893f72daa461ed2d868a"} Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.808118 4699 scope.go:117] "RemoveContainer" containerID="b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.835449 4699 scope.go:117] "RemoveContainer" containerID="963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.862078 4699 scope.go:117] "RemoveContainer" containerID="b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" Nov 22 04:29:55 crc kubenswrapper[4699]: E1122 04:29:55.862517 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff\": container with ID starting with b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff not found: ID does not exist" containerID="b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.862555 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff"} err="failed to get container status 
\"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff\": rpc error: code = NotFound desc = could not find container \"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff\": container with ID starting with b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff not found: ID does not exist" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.862580 4699 scope.go:117] "RemoveContainer" containerID="963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864124 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:55 crc kubenswrapper[4699]: E1122 04:29:55.864209 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600\": container with ID starting with 963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600 not found: ID does not exist" containerID="963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864238 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600"} err="failed to get container status \"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600\": rpc error: code = NotFound desc = could not find container \"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600\": container with ID starting with 963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600 not found: ID does not exist" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864258 4699 scope.go:117] "RemoveContainer" containerID="b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864633 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff"} err="failed to get container status \"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff\": rpc error: code = NotFound desc = could not find container \"b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff\": container with ID starting with b4c591cadcbdf8a1250f9399e79d45c5da794921a6fe3958199688a4918899ff not found: ID does not exist" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864657 4699 scope.go:117] "RemoveContainer" containerID="963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.864894 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600"} err="failed to get container status \"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600\": rpc error: code = NotFound desc = could not find container \"963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600\": container with ID starting with 963f8fd42d2f11a6959168dff5baf3bc49ac6601207b0f0078f213d729c3b600 not found: ID does not exist" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.892720 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.919882 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:55 crc kubenswrapper[4699]: E1122 04:29:55.920310 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-log" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.920325 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-log" Nov 22 04:29:55 crc 
kubenswrapper[4699]: E1122 04:29:55.920341 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-metadata" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.920349 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-metadata" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.920570 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-log" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.920601 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" containerName="nova-metadata-metadata" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.921645 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.923452 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.923784 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.931949 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.967035 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7l9b\" (UniqueName: \"kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.967126 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.967325 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.967488 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:55 crc kubenswrapper[4699]: I1122 04:29:55.967593 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.069464 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.069556 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.069603 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.069641 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.069687 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7l9b\" (UniqueName: \"kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.070057 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.073796 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 
crc kubenswrapper[4699]: I1122 04:29:56.084998 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.085328 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.088083 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7l9b\" (UniqueName: \"kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b\") pod \"nova-metadata-0\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.241702 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.756309 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:29:56 crc kubenswrapper[4699]: W1122 04:29:56.764678 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba50e6f7_ac95_401c_bd9d_b17c3bb807ac.slice/crio-b578ff822917a821e55debdadc783795044fad91717c7f7794e62cb5a59dbae0 WatchSource:0}: Error finding container b578ff822917a821e55debdadc783795044fad91717c7f7794e62cb5a59dbae0: Status 404 returned error can't find the container with id b578ff822917a821e55debdadc783795044fad91717c7f7794e62cb5a59dbae0 Nov 22 04:29:56 crc kubenswrapper[4699]: I1122 04:29:56.834479 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerStarted","Data":"b578ff822917a821e55debdadc783795044fad91717c7f7794e62cb5a59dbae0"} Nov 22 04:29:57 crc kubenswrapper[4699]: I1122 04:29:57.457928 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265829e9-7219-4f62-becf-4f5b3aed77ae" path="/var/lib/kubelet/pods/265829e9-7219-4f62-becf-4f5b3aed77ae/volumes" Nov 22 04:29:57 crc kubenswrapper[4699]: I1122 04:29:57.846596 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerStarted","Data":"01dd3154d91bf0224aa17e55b3d24ffd16253265b20244b74ecaf2289c0fb564"} Nov 22 04:29:57 crc kubenswrapper[4699]: I1122 04:29:57.847706 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerStarted","Data":"e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686"} Nov 22 04:29:57 crc kubenswrapper[4699]: I1122 04:29:57.865770 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.865739624 podStartE2EDuration="2.865739624s" podCreationTimestamp="2025-11-22 04:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:29:57.864177727 +0000 UTC m=+1349.206798924" watchObservedRunningTime="2025-11-22 04:29:57.865739624 +0000 UTC m=+1349.208360811" Nov 22 04:29:58 crc kubenswrapper[4699]: I1122 04:29:58.856111 4699 generic.go:334] "Generic (PLEG): container finished" podID="88235903-ad93-439b-94a0-cd4afd05370f" containerID="2c5de007d8fb94ac8a47a48b15acdd5cbfebbea9818405299e4042a67be3a215" exitCode=0 Nov 22 04:29:58 crc kubenswrapper[4699]: I1122 04:29:58.856192 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72t27" event={"ID":"88235903-ad93-439b-94a0-cd4afd05370f","Type":"ContainerDied","Data":"2c5de007d8fb94ac8a47a48b15acdd5cbfebbea9818405299e4042a67be3a215"} Nov 22 04:29:58 crc kubenswrapper[4699]: I1122 04:29:58.981289 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 04:29:58 crc kubenswrapper[4699]: I1122 04:29:58.981341 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.025216 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.060681 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.363417 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.424157 4699 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.424595 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="dnsmasq-dns" containerID="cri-o://6b465583892dd38329a475f91a9adefc3fae48c4f64ddf86e6c7c142ed04623b" gracePeriod=10 Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.867401 4699 generic.go:334] "Generic (PLEG): container finished" podID="96a27084-0580-4b64-9cde-906db9a6f231" containerID="6b465583892dd38329a475f91a9adefc3fae48c4f64ddf86e6c7c142ed04623b" exitCode=0 Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.867508 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" event={"ID":"96a27084-0580-4b64-9cde-906db9a6f231","Type":"ContainerDied","Data":"6b465583892dd38329a475f91a9adefc3fae48c4f64ddf86e6c7c142ed04623b"} Nov 22 04:29:59 crc kubenswrapper[4699]: I1122 04:29:59.915115 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.026013 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.063681 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.064269 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105067 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105145 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105211 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105247 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105361 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22bq2\" (UniqueName: \"kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.105498 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc\") pod \"96a27084-0580-4b64-9cde-906db9a6f231\" (UID: \"96a27084-0580-4b64-9cde-906db9a6f231\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.139188 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2" (OuterVolumeSpecName: "kube-api-access-22bq2") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). InnerVolumeSpecName "kube-api-access-22bq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.156095 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw"] Nov 22 04:30:00 crc kubenswrapper[4699]: E1122 04:30:00.165994 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="dnsmasq-dns" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.168914 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="dnsmasq-dns" Nov 22 04:30:00 crc kubenswrapper[4699]: E1122 04:30:00.169619 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="init" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.169759 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="init" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.170715 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a27084-0580-4b64-9cde-906db9a6f231" containerName="dnsmasq-dns" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.171784 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.185059 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.185303 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.242612 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.242685 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ddz7\" (UniqueName: \"kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.242876 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.243180 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22bq2\" (UniqueName: 
\"kubernetes.io/projected/96a27084-0580-4b64-9cde-906db9a6f231-kube-api-access-22bq2\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.248688 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw"] Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.276413 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.284142 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.288890 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.314890 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.334814 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config" (OuterVolumeSpecName: "config") pod "96a27084-0580-4b64-9cde-906db9a6f231" (UID: "96a27084-0580-4b64-9cde-906db9a6f231"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.342121 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72t27" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.344854 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ddz7\" (UniqueName: \"kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.344980 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.345161 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc 
kubenswrapper[4699]: I1122 04:30:00.345251 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.345273 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.345289 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.345302 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.345402 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/96a27084-0580-4b64-9cde-906db9a6f231-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.347078 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.351939 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume\") pod \"collect-profiles-29396430-vmtpw\" 
(UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.373143 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ddz7\" (UniqueName: \"kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7\") pod \"collect-profiles-29396430-vmtpw\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.446427 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts\") pod \"88235903-ad93-439b-94a0-cd4afd05370f\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.446577 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data\") pod \"88235903-ad93-439b-94a0-cd4afd05370f\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.446619 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbc67\" (UniqueName: \"kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67\") pod \"88235903-ad93-439b-94a0-cd4afd05370f\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.446643 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle\") pod \"88235903-ad93-439b-94a0-cd4afd05370f\" (UID: \"88235903-ad93-439b-94a0-cd4afd05370f\") " Nov 22 04:30:00 crc 
kubenswrapper[4699]: I1122 04:30:00.451611 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts" (OuterVolumeSpecName: "scripts") pod "88235903-ad93-439b-94a0-cd4afd05370f" (UID: "88235903-ad93-439b-94a0-cd4afd05370f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.455676 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67" (OuterVolumeSpecName: "kube-api-access-tbc67") pod "88235903-ad93-439b-94a0-cd4afd05370f" (UID: "88235903-ad93-439b-94a0-cd4afd05370f"). InnerVolumeSpecName "kube-api-access-tbc67". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.475868 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data" (OuterVolumeSpecName: "config-data") pod "88235903-ad93-439b-94a0-cd4afd05370f" (UID: "88235903-ad93-439b-94a0-cd4afd05370f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.484219 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88235903-ad93-439b-94a0-cd4afd05370f" (UID: "88235903-ad93-439b-94a0-cd4afd05370f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.549327 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.549364 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.549379 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbc67\" (UniqueName: \"kubernetes.io/projected/88235903-ad93-439b-94a0-cd4afd05370f-kube-api-access-tbc67\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.549390 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88235903-ad93-439b-94a0-cd4afd05370f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.639911 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.900918 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-72t27" event={"ID":"88235903-ad93-439b-94a0-cd4afd05370f","Type":"ContainerDied","Data":"2fd47d709b91d2f454a310a2d470c5405695da62aac4bc990772cfc28ea05fee"} Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.900986 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd47d709b91d2f454a310a2d470c5405695da62aac4bc990772cfc28ea05fee" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.901071 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-72t27" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.907725 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" event={"ID":"96a27084-0580-4b64-9cde-906db9a6f231","Type":"ContainerDied","Data":"17bb57c2e772c5534959caa05624ca29d250ad48563f05ebece5793bf6078cb3"} Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.908237 4699 scope.go:117] "RemoveContainer" containerID="6b465583892dd38329a475f91a9adefc3fae48c4f64ddf86e6c7c142ed04623b" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.907806 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-84dp4" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.910596 4699 generic.go:334] "Generic (PLEG): container finished" podID="cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" containerID="4f8c36580a32469f7b1a2b1c6ff3454a6d29bd61b38ca805d1c2fc218f05457c" exitCode=0 Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.910660 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" event={"ID":"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1","Type":"ContainerDied","Data":"4f8c36580a32469f7b1a2b1c6ff3454a6d29bd61b38ca805d1c2fc218f05457c"} Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.946497 4699 scope.go:117] "RemoveContainer" containerID="05f6d2b47545b51eb320788f3230ef396c1f699ec82849b48f100eb8ac03cb5f" Nov 22 04:30:00 crc kubenswrapper[4699]: I1122 04:30:00.981621 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:00.999547 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-84dp4"] Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.090018 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:01 crc 
kubenswrapper[4699]: I1122 04:30:01.090530 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-log" containerID="cri-o://97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9" gracePeriod=30 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.091456 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-api" containerID="cri-o://28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5" gracePeriod=30 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.097298 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.125920 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.126450 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-metadata" containerID="cri-o://01dd3154d91bf0224aa17e55b3d24ffd16253265b20244b74ecaf2289c0fb564" gracePeriod=30 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.126755 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-log" containerID="cri-o://e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686" gracePeriod=30 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.181061 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw"] Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.242884 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.245635 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.468170 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a27084-0580-4b64-9cde-906db9a6f231" path="/var/lib/kubelet/pods/96a27084-0580-4b64-9cde-906db9a6f231/volumes" Nov 22 04:30:01 crc kubenswrapper[4699]: E1122 04:30:01.775892 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba50e6f7_ac95_401c_bd9d_b17c3bb807ac.slice/crio-conmon-e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.923418 4699 generic.go:334] "Generic (PLEG): container finished" podID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerID="97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9" exitCode=143 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.925025 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerDied","Data":"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9"} Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.927972 4699 generic.go:334] "Generic (PLEG): container finished" podID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerID="01dd3154d91bf0224aa17e55b3d24ffd16253265b20244b74ecaf2289c0fb564" exitCode=0 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.928004 4699 generic.go:334] "Generic (PLEG): container finished" podID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerID="e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686" exitCode=143 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 
04:30:01.928050 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerDied","Data":"01dd3154d91bf0224aa17e55b3d24ffd16253265b20244b74ecaf2289c0fb564"} Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.928087 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerDied","Data":"e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686"} Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.929945 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerName="nova-scheduler-scheduler" containerID="cri-o://f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" gracePeriod=30 Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.930250 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" event={"ID":"ae586a43-829a-4c64-9384-cacf973a9c67","Type":"ContainerStarted","Data":"63e95cc071c1e06b5709beac1648748635660c02b13cb55355eee3379d543e2b"} Nov 22 04:30:01 crc kubenswrapper[4699]: I1122 04:30:01.930282 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" event={"ID":"ae586a43-829a-4c64-9384-cacf973a9c67","Type":"ContainerStarted","Data":"94a19dfc52720d1208735db960ebb5a7c2bd418244f303bd239481a590995ac7"} Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.462624 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.475078 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594240 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs\") pod \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594354 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts\") pod \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594450 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data\") pod \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594493 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs\") pod \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594597 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle\") pod \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594628 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data\") pod \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.594720 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7l9b\" (UniqueName: \"kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b\") pod \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\" (UID: \"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.595105 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs" (OuterVolumeSpecName: "logs") pod "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" (UID: "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.595977 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle\") pod \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.596010 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ml2v\" (UniqueName: \"kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v\") pod \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\" (UID: \"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1\") " Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.601188 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.602679 4699 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b" (OuterVolumeSpecName: "kube-api-access-k7l9b") pod "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" (UID: "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac"). InnerVolumeSpecName "kube-api-access-k7l9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.603542 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts" (OuterVolumeSpecName: "scripts") pod "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" (UID: "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.606134 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v" (OuterVolumeSpecName: "kube-api-access-4ml2v") pod "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" (UID: "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1"). InnerVolumeSpecName "kube-api-access-4ml2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.631400 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data" (OuterVolumeSpecName: "config-data") pod "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" (UID: "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.631662 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data" (OuterVolumeSpecName: "config-data") pod "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" (UID: "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.635875 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" (UID: "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.640901 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" (UID: "cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.684775 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" (UID: "ba50e6f7-ac95-401c-bd9d-b17c3bb807ac"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704021 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704380 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704475 4699 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704540 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704598 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704675 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7l9b\" (UniqueName: \"kubernetes.io/projected/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac-kube-api-access-k7l9b\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704742 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.704800 
4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ml2v\" (UniqueName: \"kubernetes.io/projected/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1-kube-api-access-4ml2v\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.942934 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ba50e6f7-ac95-401c-bd9d-b17c3bb807ac","Type":"ContainerDied","Data":"b578ff822917a821e55debdadc783795044fad91717c7f7794e62cb5a59dbae0"} Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.943019 4699 scope.go:117] "RemoveContainer" containerID="01dd3154d91bf0224aa17e55b3d24ffd16253265b20244b74ecaf2289c0fb564" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.943249 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.945073 4699 generic.go:334] "Generic (PLEG): container finished" podID="ae586a43-829a-4c64-9384-cacf973a9c67" containerID="63e95cc071c1e06b5709beac1648748635660c02b13cb55355eee3379d543e2b" exitCode=0 Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.945173 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" event={"ID":"ae586a43-829a-4c64-9384-cacf973a9c67","Type":"ContainerDied","Data":"63e95cc071c1e06b5709beac1648748635660c02b13cb55355eee3379d543e2b"} Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.950909 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" event={"ID":"cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1","Type":"ContainerDied","Data":"250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe"} Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.950956 4699 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="250ae7f8ace6216652a7d09d8e2afd30e70170f851f05c6ed8411df7ccf6fabe" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.951005 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zvrvx" Nov 22 04:30:02 crc kubenswrapper[4699]: I1122 04:30:02.990642 4699 scope.go:117] "RemoveContainer" containerID="e3139b50e0b51a28813d268d00999d5a9f7464a70e3723e38932093f213c2686" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.024012 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.044522 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.072201 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: E1122 04:30:03.072761 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" containerName="nova-cell1-conductor-db-sync" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.072823 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" containerName="nova-cell1-conductor-db-sync" Nov 22 04:30:03 crc kubenswrapper[4699]: E1122 04:30:03.072843 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-log" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.072855 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-log" Nov 22 04:30:03 crc kubenswrapper[4699]: E1122 04:30:03.072880 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88235903-ad93-439b-94a0-cd4afd05370f" containerName="nova-manage" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.072888 
4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="88235903-ad93-439b-94a0-cd4afd05370f" containerName="nova-manage" Nov 22 04:30:03 crc kubenswrapper[4699]: E1122 04:30:03.072930 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-metadata" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.072938 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-metadata" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.073383 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" containerName="nova-cell1-conductor-db-sync" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.073405 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-metadata" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.073417 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="88235903-ad93-439b-94a0-cd4afd05370f" containerName="nova-manage" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.073476 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" containerName="nova-metadata-log" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.074250 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.079546 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.090149 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.130319 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.132606 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.137017 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.138788 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.161720 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.227503 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.227578 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9ns\" (UniqueName: \"kubernetes.io/projected/88bb930e-50f0-4126-8410-cc3dbb3e864b-kube-api-access-ls9ns\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " 
pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.227616 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.228024 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.228087 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdzl\" (UniqueName: \"kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.228114 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.228231 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: 
I1122 04:30:03.228272 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329504 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9ns\" (UniqueName: \"kubernetes.io/projected/88bb930e-50f0-4126-8410-cc3dbb3e864b-kube-api-access-ls9ns\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329579 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329686 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329705 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdzl\" (UniqueName: \"kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329721 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329760 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329780 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.329802 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.330681 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.337591 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " 
pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.337877 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.338108 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88bb930e-50f0-4126-8410-cc3dbb3e864b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.342040 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.343169 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.351122 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9ns\" (UniqueName: \"kubernetes.io/projected/88bb930e-50f0-4126-8410-cc3dbb3e864b-kube-api-access-ls9ns\") pod \"nova-cell1-conductor-0\" (UID: \"88bb930e-50f0-4126-8410-cc3dbb3e864b\") " pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.351734 4699 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vgdzl\" (UniqueName: \"kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl\") pod \"nova-metadata-0\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.401166 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.462732 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba50e6f7-ac95-401c-bd9d-b17c3bb807ac" path="/var/lib/kubelet/pods/ba50e6f7-ac95-401c-bd9d-b17c3bb807ac/volumes" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.466589 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.912016 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 04:30:03 crc kubenswrapper[4699]: W1122 04:30:03.918781 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88bb930e_50f0_4126_8410_cc3dbb3e864b.slice/crio-434005cce466dd845bc5bf4608f4bff025202d8c813cc5c5aca575127ff096c8 WatchSource:0}: Error finding container 434005cce466dd845bc5bf4608f4bff025202d8c813cc5c5aca575127ff096c8: Status 404 returned error can't find the container with id 434005cce466dd845bc5bf4608f4bff025202d8c813cc5c5aca575127ff096c8 Nov 22 04:30:03 crc kubenswrapper[4699]: I1122 04:30:03.966480 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"88bb930e-50f0-4126-8410-cc3dbb3e864b","Type":"ContainerStarted","Data":"434005cce466dd845bc5bf4608f4bff025202d8c813cc5c5aca575127ff096c8"} Nov 22 04:30:04 crc kubenswrapper[4699]: E1122 04:30:04.040003 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:04 crc kubenswrapper[4699]: E1122 04:30:04.051055 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:04 crc kubenswrapper[4699]: E1122 04:30:04.053696 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:04 crc kubenswrapper[4699]: E1122 04:30:04.053778 4699 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerName="nova-scheduler-scheduler" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.064906 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.377886 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.461554 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume\") pod \"ae586a43-829a-4c64-9384-cacf973a9c67\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.461673 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ddz7\" (UniqueName: \"kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7\") pod \"ae586a43-829a-4c64-9384-cacf973a9c67\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.461727 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume\") pod \"ae586a43-829a-4c64-9384-cacf973a9c67\" (UID: \"ae586a43-829a-4c64-9384-cacf973a9c67\") " Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.463393 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume" (OuterVolumeSpecName: "config-volume") pod "ae586a43-829a-4c64-9384-cacf973a9c67" (UID: "ae586a43-829a-4c64-9384-cacf973a9c67"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.467122 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ae586a43-829a-4c64-9384-cacf973a9c67" (UID: "ae586a43-829a-4c64-9384-cacf973a9c67"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.467579 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7" (OuterVolumeSpecName: "kube-api-access-6ddz7") pod "ae586a43-829a-4c64-9384-cacf973a9c67" (UID: "ae586a43-829a-4c64-9384-cacf973a9c67"). InnerVolumeSpecName "kube-api-access-6ddz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.564491 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae586a43-829a-4c64-9384-cacf973a9c67-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.564802 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae586a43-829a-4c64-9384-cacf973a9c67-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.564814 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ddz7\" (UniqueName: \"kubernetes.io/projected/ae586a43-829a-4c64-9384-cacf973a9c67-kube-api-access-6ddz7\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.980717 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"88bb930e-50f0-4126-8410-cc3dbb3e864b","Type":"ContainerStarted","Data":"e037f220a85d87b765e3351621dedf2965d033e891e0e74b1bcb3067806a80a7"} Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.982143 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.984600 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerStarted","Data":"7b718a632f8c2933bd0b70f40b11df0e416eb21e6bd6ccc463cc705f917b169e"} Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.984633 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerStarted","Data":"cbc6c18c8ab51c74c196ee88bfa5b0099d42b8f3373297241d74a8059d98d955"} Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.984645 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerStarted","Data":"dfa062d51e1fefd65fefd7a83b4cde4961e1fed65f9bf0ebe674bd43b4f8bcd6"} Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.986920 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" event={"ID":"ae586a43-829a-4c64-9384-cacf973a9c67","Type":"ContainerDied","Data":"94a19dfc52720d1208735db960ebb5a7c2bd418244f303bd239481a590995ac7"} Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.986963 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94a19dfc52720d1208735db960ebb5a7c2bd418244f303bd239481a590995ac7" Nov 22 04:30:04 crc kubenswrapper[4699]: I1122 04:30:04.987025 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396430-vmtpw" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.008045 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.007955306 podStartE2EDuration="3.007955306s" podCreationTimestamp="2025-11-22 04:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:04.996045545 +0000 UTC m=+1356.338666732" watchObservedRunningTime="2025-11-22 04:30:05.007955306 +0000 UTC m=+1356.350576493" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.016749 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.016728363 podStartE2EDuration="2.016728363s" podCreationTimestamp="2025-11-22 04:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:05.012504533 +0000 UTC m=+1356.355125730" watchObservedRunningTime="2025-11-22 04:30:05.016728363 +0000 UTC m=+1356.359349550" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.571840 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.697704 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle\") pod \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.698120 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thjfx\" (UniqueName: \"kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx\") pod \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.698188 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data\") pod \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\" (UID: \"f08adc05-ca24-491e-bf3c-ea49d60d9ca8\") " Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.713894 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx" (OuterVolumeSpecName: "kube-api-access-thjfx") pod "f08adc05-ca24-491e-bf3c-ea49d60d9ca8" (UID: "f08adc05-ca24-491e-bf3c-ea49d60d9ca8"). InnerVolumeSpecName "kube-api-access-thjfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.737665 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08adc05-ca24-491e-bf3c-ea49d60d9ca8" (UID: "f08adc05-ca24-491e-bf3c-ea49d60d9ca8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.778796 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data" (OuterVolumeSpecName: "config-data") pod "f08adc05-ca24-491e-bf3c-ea49d60d9ca8" (UID: "f08adc05-ca24-491e-bf3c-ea49d60d9ca8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.800616 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.800652 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.800667 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thjfx\" (UniqueName: \"kubernetes.io/projected/f08adc05-ca24-491e-bf3c-ea49d60d9ca8-kube-api-access-thjfx\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.996543 4699 generic.go:334] "Generic (PLEG): container finished" podID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" exitCode=0 Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.996604 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.996631 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f08adc05-ca24-491e-bf3c-ea49d60d9ca8","Type":"ContainerDied","Data":"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295"} Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.997753 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f08adc05-ca24-491e-bf3c-ea49d60d9ca8","Type":"ContainerDied","Data":"a4a32210bfd85fba344901abbb5e4107666f0818475921db3def910de56575ec"} Nov 22 04:30:05 crc kubenswrapper[4699]: I1122 04:30:05.997783 4699 scope.go:117] "RemoveContainer" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.025156 4699 scope.go:117] "RemoveContainer" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" Nov 22 04:30:06 crc kubenswrapper[4699]: E1122 04:30:06.025738 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295\": container with ID starting with f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295 not found: ID does not exist" containerID="f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.025789 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295"} err="failed to get container status \"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295\": rpc error: code = NotFound desc = could not find container \"f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295\": container with ID starting with 
f1e0782e4b7cbb15a275d8f80e9f0ef0b72d79f4ae89c424dfb567897ea3e295 not found: ID does not exist" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.033425 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.046662 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.096397 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:06 crc kubenswrapper[4699]: E1122 04:30:06.097570 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae586a43-829a-4c64-9384-cacf973a9c67" containerName="collect-profiles" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.097594 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae586a43-829a-4c64-9384-cacf973a9c67" containerName="collect-profiles" Nov 22 04:30:06 crc kubenswrapper[4699]: E1122 04:30:06.097647 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerName="nova-scheduler-scheduler" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.097654 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerName="nova-scheduler-scheduler" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.097998 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" containerName="nova-scheduler-scheduler" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.098029 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae586a43-829a-4c64-9384-cacf973a9c67" containerName="collect-profiles" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.099253 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.104056 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.120328 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.209828 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.210177 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.210416 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84x5\" (UniqueName: \"kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.312828 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.313229 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x84x5\" (UniqueName: \"kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.313304 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.318945 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.320176 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.351112 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84x5\" (UniqueName: \"kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5\") pod \"nova-scheduler-0\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.421591 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 04:30:06 crc kubenswrapper[4699]: I1122 04:30:06.905611 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:07 crc kubenswrapper[4699]: I1122 04:30:07.011712 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0dc7d0f2-e54f-4255-99e0-276a543227b1","Type":"ContainerStarted","Data":"7d5240867b3476dec84a50b52728da5a86d07a17ac81b3673efd8caf5ad12db1"} Nov 22 04:30:07 crc kubenswrapper[4699]: I1122 04:30:07.461291 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08adc05-ca24-491e-bf3c-ea49d60d9ca8" path="/var/lib/kubelet/pods/f08adc05-ca24-491e-bf3c-ea49d60d9ca8/volumes" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.022063 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.022899 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0dc7d0f2-e54f-4255-99e0-276a543227b1","Type":"ContainerStarted","Data":"e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a"} Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.025117 4699 generic.go:334] "Generic (PLEG): container finished" podID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerID="28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5" exitCode=0 Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.025161 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerDied","Data":"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5"} Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.025180 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.025200 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"45f45400-db99-4fb5-b760-ac5c0f6083ff","Type":"ContainerDied","Data":"a783412b472c851b66d15220ffe164c28477ab367d4d99ef54a27d8c8a773b19"} Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.025225 4699 scope.go:117] "RemoveContainer" containerID="28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.053989 4699 scope.go:117] "RemoveContainer" containerID="97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.066135 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.0661099 podStartE2EDuration="2.0661099s" podCreationTimestamp="2025-11-22 04:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:08.055697584 +0000 UTC m=+1359.398318791" watchObservedRunningTime="2025-11-22 04:30:08.0661099 +0000 UTC m=+1359.408731107" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.078611 4699 scope.go:117] "RemoveContainer" containerID="28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5" Nov 22 04:30:08 crc kubenswrapper[4699]: E1122 04:30:08.083857 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5\": container with ID starting with 28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5 not found: ID does not exist" containerID="28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.083926 4699 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5"} err="failed to get container status \"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5\": rpc error: code = NotFound desc = could not find container \"28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5\": container with ID starting with 28c4b4e224afb6e386fc28d1e5ab5aaeccfc503c99fe360191a73e644398c5d5 not found: ID does not exist" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.083961 4699 scope.go:117] "RemoveContainer" containerID="97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9" Nov 22 04:30:08 crc kubenswrapper[4699]: E1122 04:30:08.087535 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9\": container with ID starting with 97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9 not found: ID does not exist" containerID="97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.087576 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9"} err="failed to get container status \"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9\": rpc error: code = NotFound desc = could not find container \"97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9\": container with ID starting with 97f7f9f27e420661e41d2c446ead9e9fb95086e918d77bf31b6a9e96c3bdf8b9 not found: ID does not exist" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.161855 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle\") 
pod \"45f45400-db99-4fb5-b760-ac5c0f6083ff\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.162088 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs\") pod \"45f45400-db99-4fb5-b760-ac5c0f6083ff\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.162313 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf8n2\" (UniqueName: \"kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2\") pod \"45f45400-db99-4fb5-b760-ac5c0f6083ff\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.162504 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data\") pod \"45f45400-db99-4fb5-b760-ac5c0f6083ff\" (UID: \"45f45400-db99-4fb5-b760-ac5c0f6083ff\") " Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.162665 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs" (OuterVolumeSpecName: "logs") pod "45f45400-db99-4fb5-b760-ac5c0f6083ff" (UID: "45f45400-db99-4fb5-b760-ac5c0f6083ff"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.167768 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45f45400-db99-4fb5-b760-ac5c0f6083ff-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.168919 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.175520 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2" (OuterVolumeSpecName: "kube-api-access-bf8n2") pod "45f45400-db99-4fb5-b760-ac5c0f6083ff" (UID: "45f45400-db99-4fb5-b760-ac5c0f6083ff"). InnerVolumeSpecName "kube-api-access-bf8n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.207089 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data" (OuterVolumeSpecName: "config-data") pod "45f45400-db99-4fb5-b760-ac5c0f6083ff" (UID: "45f45400-db99-4fb5-b760-ac5c0f6083ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.215654 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45f45400-db99-4fb5-b760-ac5c0f6083ff" (UID: "45f45400-db99-4fb5-b760-ac5c0f6083ff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.269623 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.269649 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f45400-db99-4fb5-b760-ac5c0f6083ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.269660 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf8n2\" (UniqueName: \"kubernetes.io/projected/45f45400-db99-4fb5-b760-ac5c0f6083ff-kube-api-access-bf8n2\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.382570 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.410488 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.449964 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:08 crc kubenswrapper[4699]: E1122 04:30:08.450362 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-api" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.450376 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-api" Nov 22 04:30:08 crc kubenswrapper[4699]: E1122 04:30:08.450397 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-log" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.450403 4699 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-log" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.450590 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-log" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.450624 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" containerName="nova-api-api" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.451656 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.456919 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.473383 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.473444 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.474059 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.577248 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.577491 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " 
pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.577752 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.577974 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmv7p\" (UniqueName: \"kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.679779 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmv7p\" (UniqueName: \"kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.680192 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.680355 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.680569 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.680980 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.684895 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.691806 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.707044 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmv7p\" (UniqueName: \"kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p\") pod \"nova-api-0\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " pod="openstack/nova-api-0" Nov 22 04:30:08 crc kubenswrapper[4699]: I1122 04:30:08.785955 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:09 crc kubenswrapper[4699]: I1122 04:30:09.280425 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:09 crc kubenswrapper[4699]: I1122 04:30:09.463001 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45f45400-db99-4fb5-b760-ac5c0f6083ff" path="/var/lib/kubelet/pods/45f45400-db99-4fb5-b760-ac5c0f6083ff/volumes" Nov 22 04:30:10 crc kubenswrapper[4699]: I1122 04:30:10.058229 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerStarted","Data":"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1"} Nov 22 04:30:10 crc kubenswrapper[4699]: I1122 04:30:10.058826 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerStarted","Data":"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4"} Nov 22 04:30:10 crc kubenswrapper[4699]: I1122 04:30:10.059011 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerStarted","Data":"e9ee9e7d6a11b1bbf879ff53224a079734d73db7ea4b7d22fd4bad2bafb3d14f"} Nov 22 04:30:10 crc kubenswrapper[4699]: I1122 04:30:10.092129 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.092105672 podStartE2EDuration="2.092105672s" podCreationTimestamp="2025-11-22 04:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:10.086269434 +0000 UTC m=+1361.428890621" watchObservedRunningTime="2025-11-22 04:30:10.092105672 +0000 UTC m=+1361.434726859" Nov 22 04:30:11 crc kubenswrapper[4699]: I1122 04:30:11.422520 4699 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.074179 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.074466 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" containerName="kube-state-metrics" containerID="cri-o://e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48" gracePeriod=30 Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.561262 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.658240 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqfmk\" (UniqueName: \"kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk\") pod \"c970cf2e-16a0-42fe-ba32-ee217bd82db8\" (UID: \"c970cf2e-16a0-42fe-ba32-ee217bd82db8\") " Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.664828 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk" (OuterVolumeSpecName: "kube-api-access-kqfmk") pod "c970cf2e-16a0-42fe-ba32-ee217bd82db8" (UID: "c970cf2e-16a0-42fe-ba32-ee217bd82db8"). InnerVolumeSpecName "kube-api-access-kqfmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:12 crc kubenswrapper[4699]: I1122 04:30:12.761539 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqfmk\" (UniqueName: \"kubernetes.io/projected/c970cf2e-16a0-42fe-ba32-ee217bd82db8-kube-api-access-kqfmk\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.087211 4699 generic.go:334] "Generic (PLEG): container finished" podID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" containerID="e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48" exitCode=2 Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.087261 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c970cf2e-16a0-42fe-ba32-ee217bd82db8","Type":"ContainerDied","Data":"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48"} Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.087313 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"c970cf2e-16a0-42fe-ba32-ee217bd82db8","Type":"ContainerDied","Data":"474fca0310fc3d04a4748fc896f81028db9d710e1ca887517833c1044552ebe0"} Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.087332 4699 scope.go:117] "RemoveContainer" containerID="e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.087963 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.116869 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.124993 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.142115 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:13 crc kubenswrapper[4699]: E1122 04:30:13.142615 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" containerName="kube-state-metrics" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.142632 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" containerName="kube-state-metrics" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.142855 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" containerName="kube-state-metrics" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.143654 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.146925 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.146931 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.155955 4699 scope.go:117] "RemoveContainer" containerID="e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48" Nov 22 04:30:13 crc kubenswrapper[4699]: E1122 04:30:13.159549 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48\": container with ID starting with e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48 not found: ID does not exist" containerID="e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.159600 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48"} err="failed to get container status \"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48\": rpc error: code = NotFound desc = could not find container \"e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48\": container with ID starting with e06ad23a072d71a58d00055d7639c48a8adda994392923c8b2a5b9e564510a48 not found: ID does not exist" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.175479 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.272426 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.272695 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5jg\" (UniqueName: \"kubernetes.io/projected/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-api-access-7t5jg\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.272859 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.272927 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.375216 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5jg\" (UniqueName: \"kubernetes.io/projected/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-api-access-7t5jg\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.375327 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.375350 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.375405 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.381831 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.381928 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.392513 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5jg\" (UniqueName: 
\"kubernetes.io/projected/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-api-access-7t5jg\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.393227 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef3b350d-96dc-4b7f-bc63-586d92e57da6-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef3b350d-96dc-4b7f-bc63-586d92e57da6\") " pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.438353 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.465927 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c970cf2e-16a0-42fe-ba32-ee217bd82db8" path="/var/lib/kubelet/pods/c970cf2e-16a0-42fe-ba32-ee217bd82db8/volumes" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.466926 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.466963 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.470098 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 04:30:13 crc kubenswrapper[4699]: I1122 04:30:13.965422 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.078777 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.079067 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-central-agent" containerID="cri-o://8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c" gracePeriod=30 Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.079110 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="sg-core" containerID="cri-o://624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee" gracePeriod=30 Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.079200 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="proxy-httpd" containerID="cri-o://9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a" gracePeriod=30 Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.079627 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-notification-agent" containerID="cri-o://7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b" gracePeriod=30 Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.100352 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"ef3b350d-96dc-4b7f-bc63-586d92e57da6","Type":"ContainerStarted","Data":"1674bcdc106986fe41f8f35c1f80e05060053013e0741a40638112856564350e"} Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.489739 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:14 crc kubenswrapper[4699]: I1122 04:30:14.489796 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:15 crc kubenswrapper[4699]: I1122 04:30:15.110051 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerID="9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a" exitCode=0 Nov 22 04:30:15 crc kubenswrapper[4699]: I1122 04:30:15.110083 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerID="624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee" exitCode=2 Nov 22 04:30:15 crc kubenswrapper[4699]: I1122 04:30:15.110090 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerID="8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c" exitCode=0 Nov 22 04:30:15 crc kubenswrapper[4699]: I1122 04:30:15.110109 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerDied","Data":"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a"} Nov 22 04:30:15 crc 
kubenswrapper[4699]: I1122 04:30:15.110134 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerDied","Data":"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee"} Nov 22 04:30:15 crc kubenswrapper[4699]: I1122 04:30:15.110145 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerDied","Data":"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c"} Nov 22 04:30:16 crc kubenswrapper[4699]: I1122 04:30:16.121912 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef3b350d-96dc-4b7f-bc63-586d92e57da6","Type":"ContainerStarted","Data":"9c301f78baae292b480b066c50c58e3be8eeacadffac56ea90017a9953188624"} Nov 22 04:30:16 crc kubenswrapper[4699]: I1122 04:30:16.122067 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 04:30:16 crc kubenswrapper[4699]: I1122 04:30:16.139916 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.123603185 podStartE2EDuration="3.139894603s" podCreationTimestamp="2025-11-22 04:30:13 +0000 UTC" firstStartedPulling="2025-11-22 04:30:13.988680147 +0000 UTC m=+1365.331301334" lastFinishedPulling="2025-11-22 04:30:15.004971565 +0000 UTC m=+1366.347592752" observedRunningTime="2025-11-22 04:30:16.137547397 +0000 UTC m=+1367.480168604" watchObservedRunningTime="2025-11-22 04:30:16.139894603 +0000 UTC m=+1367.482515790" Nov 22 04:30:16 crc kubenswrapper[4699]: I1122 04:30:16.422380 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 04:30:16 crc kubenswrapper[4699]: I1122 04:30:16.455177 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.161638 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.634753 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.667896 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.667964 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k575c\" (UniqueName: \"kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668003 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668031 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668400 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668540 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668614 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668728 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.668856 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml\") pod \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\" (UID: \"6b8dfd21-2d6c-4e46-ac51-75888ce472c4\") " Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.680542 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts" (OuterVolumeSpecName: "scripts") pod 
"6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.688348 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c" (OuterVolumeSpecName: "kube-api-access-k575c") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "kube-api-access-k575c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.712553 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.766950 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771706 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k575c\" (UniqueName: \"kubernetes.io/projected/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-kube-api-access-k575c\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771739 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771749 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771759 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771770 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.771781 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.815150 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data" (OuterVolumeSpecName: "config-data") pod "6b8dfd21-2d6c-4e46-ac51-75888ce472c4" (UID: "6b8dfd21-2d6c-4e46-ac51-75888ce472c4"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:17 crc kubenswrapper[4699]: I1122 04:30:17.873149 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b8dfd21-2d6c-4e46-ac51-75888ce472c4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.140637 4699 generic.go:334] "Generic (PLEG): container finished" podID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerID="7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b" exitCode=0 Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.140730 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerDied","Data":"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b"} Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.140782 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6b8dfd21-2d6c-4e46-ac51-75888ce472c4","Type":"ContainerDied","Data":"fba07e7dc8810cbcd10e718beb31e93e165755913086d5baa9b1c2c0256a0f93"} Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.140805 4699 scope.go:117] "RemoveContainer" containerID="9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.140741 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.169957 4699 scope.go:117] "RemoveContainer" containerID="624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.182083 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.198717 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.211871 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.212520 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="sg-core" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.212589 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="sg-core" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.212690 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-notification-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.212744 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-notification-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.212813 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="proxy-httpd" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.212865 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="proxy-httpd" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.212935 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-central-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.212986 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-central-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.213229 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="sg-core" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.213299 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-central-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.213363 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="ceilometer-notification-agent" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.213419 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" containerName="proxy-httpd" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.217898 4699 scope.go:117] "RemoveContainer" containerID="7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.218582 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.227113 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.227330 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.227366 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.234158 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.280762 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.280950 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281224 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281291 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281331 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281551 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281783 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.281868 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfj4x\" (UniqueName: \"kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.301879 4699 scope.go:117] "RemoveContainer" containerID="8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.323474 4699 scope.go:117] "RemoveContainer" 
containerID="9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.324105 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a\": container with ID starting with 9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a not found: ID does not exist" containerID="9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.324151 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a"} err="failed to get container status \"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a\": rpc error: code = NotFound desc = could not find container \"9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a\": container with ID starting with 9c53850d7831be45a1256cd05431c371d93b9830c5944b7812b3509fe972186a not found: ID does not exist" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.324178 4699 scope.go:117] "RemoveContainer" containerID="624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.325056 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee\": container with ID starting with 624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee not found: ID does not exist" containerID="624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.325082 4699 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee"} err="failed to get container status \"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee\": rpc error: code = NotFound desc = could not find container \"624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee\": container with ID starting with 624b3ed34e207030740b442941eaafeb37bee8c2c7bd7356ea9b503fff6630ee not found: ID does not exist" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.325099 4699 scope.go:117] "RemoveContainer" containerID="7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.325352 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b\": container with ID starting with 7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b not found: ID does not exist" containerID="7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.325377 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b"} err="failed to get container status \"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b\": rpc error: code = NotFound desc = could not find container \"7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b\": container with ID starting with 7f37a5a81082c33bf0967b5e8816f8f47b8e9468a1bd24edf880dcc68fd69b8b not found: ID does not exist" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.325391 4699 scope.go:117] "RemoveContainer" containerID="8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c" Nov 22 04:30:18 crc kubenswrapper[4699]: E1122 04:30:18.325921 4699 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c\": container with ID starting with 8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c not found: ID does not exist" containerID="8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.325946 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c"} err="failed to get container status \"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c\": rpc error: code = NotFound desc = could not find container \"8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c\": container with ID starting with 8a1d3ff60c0447fa81bd5fd576b08ecf3cb01f27126b575e4c457e96789d875c not found: ID does not exist" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384121 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384251 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384307 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 
04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384360 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384394 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384530 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384815 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384950 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfj4x\" (UniqueName: \"kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.384994 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.385050 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.391356 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.391566 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.392032 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.393590 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.393856 4699 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.407350 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfj4x\" (UniqueName: \"kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x\") pod \"ceilometer-0\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.592535 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.786956 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 04:30:18 crc kubenswrapper[4699]: I1122 04:30:18.787402 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 04:30:19 crc kubenswrapper[4699]: I1122 04:30:19.051687 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:19 crc kubenswrapper[4699]: I1122 04:30:19.152881 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerStarted","Data":"57d9637b2d292b9e4a4497615cb4946cd4f9ae8644224435b49d3086827bdafa"} Nov 22 04:30:19 crc kubenswrapper[4699]: I1122 04:30:19.469848 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8dfd21-2d6c-4e46-ac51-75888ce472c4" path="/var/lib/kubelet/pods/6b8dfd21-2d6c-4e46-ac51-75888ce472c4/volumes" Nov 22 04:30:19 crc kubenswrapper[4699]: I1122 04:30:19.869576 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:19 crc kubenswrapper[4699]: I1122 04:30:19.869641 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 04:30:20 crc kubenswrapper[4699]: I1122 04:30:20.172053 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerStarted","Data":"f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11"} Nov 22 04:30:23 crc kubenswrapper[4699]: I1122 04:30:23.204674 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerStarted","Data":"a0ce3e21fbd0c60abcb1117fd7fd6e2b4cb491b542db1273d4401c6b424f8827"} Nov 22 04:30:23 crc kubenswrapper[4699]: I1122 04:30:23.472967 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 04:30:23 crc kubenswrapper[4699]: I1122 04:30:23.488972 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 04:30:23 crc kubenswrapper[4699]: I1122 04:30:23.489839 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 04:30:23 crc kubenswrapper[4699]: I1122 04:30:23.494064 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.227600 4699 generic.go:334] "Generic (PLEG): container finished" 
podID="6b0a42c8-e8a1-45b3-9f29-77459d98ea4d" containerID="a61d8c7fa94c040b3d1169b2c17f00afe5d612c8291396061cb0c712611ae221" exitCode=0 Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.228013 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerDied","Data":"a61d8c7fa94c040b3d1169b2c17f00afe5d612c8291396061cb0c712611ae221"} Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.236609 4699 generic.go:334] "Generic (PLEG): container finished" podID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" containerID="c9e75a351eef56340e30a3fd159bb197d4d0e4720077044dccb87ea5eabdb207" exitCode=137 Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.237034 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c","Type":"ContainerDied","Data":"c9e75a351eef56340e30a3fd159bb197d4d0e4720077044dccb87ea5eabdb207"} Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.238682 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c","Type":"ContainerDied","Data":"30cf6bf49210f9b9bdba9fa2d0513ea94a5c2b08560592417a06252b217df8e0"} Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.238712 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30cf6bf49210f9b9bdba9fa2d0513ea94a5c2b08560592417a06252b217df8e0" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.247088 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.255656 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerStarted","Data":"9c64e6454dc69e6a16029d8e755c3d2a7b0c6469d93e4615c63dc2e8ac2551dd"} Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.315248 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data\") pod \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.315481 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztmtk\" (UniqueName: \"kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk\") pod \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.315514 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle\") pod \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\" (UID: \"3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c\") " Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.332868 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk" (OuterVolumeSpecName: "kube-api-access-ztmtk") pod "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" (UID: "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c"). InnerVolumeSpecName "kube-api-access-ztmtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.367861 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data" (OuterVolumeSpecName: "config-data") pod "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" (UID: "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.371602 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.414567 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" (UID: "3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.417847 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.417880 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztmtk\" (UniqueName: \"kubernetes.io/projected/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-kube-api-access-ztmtk\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:24 crc kubenswrapper[4699]: I1122 04:30:24.417892 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.266723 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.267326 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"06badc71b8fe16449b0ee4e3ae39ea512985b4f37781526065efbac707501579"} Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.303605 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.315257 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.340683 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:30:25 crc kubenswrapper[4699]: E1122 04:30:25.341197 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.341230 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.341564 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.342654 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.345844 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.345965 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.353570 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.364352 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.437297 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.437394 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.437521 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.437556 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.437684 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qp5\" (UniqueName: \"kubernetes.io/projected/a36302e5-6f2a-4c2a-80db-9d02fea03316-kube-api-access-46qp5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.461528 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c" path="/var/lib/kubelet/pods/3e8d0ac5-ca1e-4a0a-9d3e-deed882a153c/volumes" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.539889 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qp5\" (UniqueName: \"kubernetes.io/projected/a36302e5-6f2a-4c2a-80db-9d02fea03316-kube-api-access-46qp5\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.539990 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.540122 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.540193 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.540243 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.554500 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 
04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.555472 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.556309 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.557278 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36302e5-6f2a-4c2a-80db-9d02fea03316-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.564596 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qp5\" (UniqueName: \"kubernetes.io/projected/a36302e5-6f2a-4c2a-80db-9d02fea03316-kube-api-access-46qp5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a36302e5-6f2a-4c2a-80db-9d02fea03316\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:25 crc kubenswrapper[4699]: I1122 04:30:25.713401 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.191645 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.309747 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"22d6358d59daae1ee97c75ccd61fcc7e0e9c2382f2f7423b8cd85abfcfc29b1c"} Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.309802 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"6b0a42c8-e8a1-45b3-9f29-77459d98ea4d","Type":"ContainerStarted","Data":"cfd481a71cd4752bf313ff615e11e21b264ce2ebc616b5b5b48cdfcaed8d238d"} Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.314973 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a36302e5-6f2a-4c2a-80db-9d02fea03316","Type":"ContainerStarted","Data":"545c15eff8e4b3898d08f6dbc6f050ed88be6f0b13e81812d6a666288bc4606f"} Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.320197 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerStarted","Data":"9eacdfe5f19fa8889d4a6e285e3d028d1202cf921e09f3f68ece219a970e4553"} Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.320242 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 04:30:26 crc kubenswrapper[4699]: I1122 04:30:26.352742 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.09038748 podStartE2EDuration="8.352721554s" podCreationTimestamp="2025-11-22 04:30:18 +0000 UTC" firstStartedPulling="2025-11-22 04:30:19.063663196 +0000 UTC m=+1370.406284383" lastFinishedPulling="2025-11-22 
04:30:25.32599727 +0000 UTC m=+1376.668618457" observedRunningTime="2025-11-22 04:30:26.340673239 +0000 UTC m=+1377.683294426" watchObservedRunningTime="2025-11-22 04:30:26.352721554 +0000 UTC m=+1377.695342741" Nov 22 04:30:27 crc kubenswrapper[4699]: I1122 04:30:27.332940 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a36302e5-6f2a-4c2a-80db-9d02fea03316","Type":"ContainerStarted","Data":"ebf3fe38b56ec77303b26af6ed03e79be25a90e64a116115fb58697988b291d0"} Nov 22 04:30:27 crc kubenswrapper[4699]: I1122 04:30:27.334707 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Nov 22 04:30:27 crc kubenswrapper[4699]: I1122 04:30:27.334760 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Nov 22 04:30:27 crc kubenswrapper[4699]: I1122 04:30:27.355511 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.3554874630000002 podStartE2EDuration="2.355487463s" podCreationTimestamp="2025-11-22 04:30:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:27.347539155 +0000 UTC m=+1378.690160352" watchObservedRunningTime="2025-11-22 04:30:27.355487463 +0000 UTC m=+1378.698108650" Nov 22 04:30:27 crc kubenswrapper[4699]: I1122 04:30:27.394106 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=69.435798212 podStartE2EDuration="2m12.394067293s" podCreationTimestamp="2025-11-22 04:28:15 +0000 UTC" firstStartedPulling="2025-11-22 04:28:20.216560397 +0000 UTC m=+1251.559181584" lastFinishedPulling="2025-11-22 04:29:23.174829478 +0000 UTC m=+1314.517450665" observedRunningTime="2025-11-22 04:30:27.387381686 +0000 UTC m=+1378.730002883" watchObservedRunningTime="2025-11-22 
04:30:27.394067293 +0000 UTC m=+1378.736688480" Nov 22 04:30:28 crc kubenswrapper[4699]: I1122 04:30:28.851016 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 04:30:28 crc kubenswrapper[4699]: I1122 04:30:28.853161 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 04:30:28 crc kubenswrapper[4699]: I1122 04:30:28.899645 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 04:30:28 crc kubenswrapper[4699]: I1122 04:30:28.900200 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.348755 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.349644 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.352647 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.572309 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-q6kxn"] Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.573905 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.592407 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-q6kxn"] Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.753399 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.754169 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.754221 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.754254 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-config\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.754364 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.754548 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grghm\" (UniqueName: \"kubernetes.io/projected/c75eb722-836a-4b9f-ab34-1dc246154092-kube-api-access-grghm\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857008 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grghm\" (UniqueName: \"kubernetes.io/projected/c75eb722-836a-4b9f-ab34-1dc246154092-kube-api-access-grghm\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857117 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857161 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857181 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-config\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.857240 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.858675 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.858874 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-config\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.858887 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-swift-storage-0\") pod 
\"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.858899 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.859330 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c75eb722-836a-4b9f-ab34-1dc246154092-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.895631 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grghm\" (UniqueName: \"kubernetes.io/projected/c75eb722-836a-4b9f-ab34-1dc246154092-kube-api-access-grghm\") pod \"dnsmasq-dns-89c5cd4d5-q6kxn\" (UID: \"c75eb722-836a-4b9f-ab34-1dc246154092\") " pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:29 crc kubenswrapper[4699]: I1122 04:30:29.911636 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:30 crc kubenswrapper[4699]: I1122 04:30:30.360292 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Nov 22 04:30:30 crc kubenswrapper[4699]: I1122 04:30:30.452349 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-q6kxn"] Nov 22 04:30:30 crc kubenswrapper[4699]: W1122 04:30:30.469574 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75eb722_836a_4b9f_ab34_1dc246154092.slice/crio-4a46fe12504ab735ab4f905ae19af1688ed8bc2a72cf9d8cef0d6cfa83012ad2 WatchSource:0}: Error finding container 4a46fe12504ab735ab4f905ae19af1688ed8bc2a72cf9d8cef0d6cfa83012ad2: Status 404 returned error can't find the container with id 4a46fe12504ab735ab4f905ae19af1688ed8bc2a72cf9d8cef0d6cfa83012ad2 Nov 22 04:30:30 crc kubenswrapper[4699]: I1122 04:30:30.714358 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:31 crc kubenswrapper[4699]: I1122 04:30:31.378478 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" event={"ID":"c75eb722-836a-4b9f-ab34-1dc246154092","Type":"ContainerDied","Data":"e8018879ba95fce6d4cf8b6ccfdcf328e4361bcc99415fa1d134677c6de11a9d"} Nov 22 04:30:31 crc kubenswrapper[4699]: I1122 04:30:31.378409 4699 generic.go:334] "Generic (PLEG): container finished" podID="c75eb722-836a-4b9f-ab34-1dc246154092" containerID="e8018879ba95fce6d4cf8b6ccfdcf328e4361bcc99415fa1d134677c6de11a9d" exitCode=0 Nov 22 04:30:31 crc kubenswrapper[4699]: I1122 04:30:31.378656 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" event={"ID":"c75eb722-836a-4b9f-ab34-1dc246154092","Type":"ContainerStarted","Data":"4a46fe12504ab735ab4f905ae19af1688ed8bc2a72cf9d8cef0d6cfa83012ad2"} 
Nov 22 04:30:31 crc kubenswrapper[4699]: I1122 04:30:31.853442 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.113738 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.114371 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-central-agent" containerID="cri-o://f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.114563 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="proxy-httpd" containerID="cri-o://9eacdfe5f19fa8889d4a6e285e3d028d1202cf921e09f3f68ece219a970e4553" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.114625 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="sg-core" containerID="cri-o://9c64e6454dc69e6a16029d8e755c3d2a7b0c6469d93e4615c63dc2e8ac2551dd" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.114677 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-notification-agent" containerID="cri-o://a0ce3e21fbd0c60abcb1117fd7fd6e2b4cb491b542db1273d4401c6b424f8827" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.390387 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" 
event={"ID":"c75eb722-836a-4b9f-ab34-1dc246154092","Type":"ContainerStarted","Data":"b664776f8ddb188385d9c123937da79657c91bb491194600cdf1c2e4f8eb799f"} Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.391353 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.393934 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerID="9eacdfe5f19fa8889d4a6e285e3d028d1202cf921e09f3f68ece219a970e4553" exitCode=0 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.393969 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerID="9c64e6454dc69e6a16029d8e755c3d2a7b0c6469d93e4615c63dc2e8ac2551dd" exitCode=2 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.394010 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerDied","Data":"9eacdfe5f19fa8889d4a6e285e3d028d1202cf921e09f3f68ece219a970e4553"} Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.394067 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerDied","Data":"9c64e6454dc69e6a16029d8e755c3d2a7b0c6469d93e4615c63dc2e8ac2551dd"} Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.394195 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-log" containerID="cri-o://c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.394239 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" 
containerName="nova-api-api" containerID="cri-o://ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1" gracePeriod=30 Nov 22 04:30:32 crc kubenswrapper[4699]: I1122 04:30:32.422050 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" podStartSLOduration=3.422026283 podStartE2EDuration="3.422026283s" podCreationTimestamp="2025-11-22 04:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:32.413169824 +0000 UTC m=+1383.755791031" watchObservedRunningTime="2025-11-22 04:30:32.422026283 +0000 UTC m=+1383.764647470" Nov 22 04:30:32 crc kubenswrapper[4699]: E1122 04:30:32.614105 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c4c472_2f99_4aac_933b_1f19965b8d06.slice/crio-conmon-c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c4c472_2f99_4aac_933b_1f19965b8d06.slice/crio-c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a77dcc1_88a0_4a7d_9e71_aa9a89178683.slice/crio-conmon-f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.420412 4699 generic.go:334] "Generic (PLEG): container finished" podID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerID="c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4" exitCode=143 Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.420553 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerDied","Data":"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4"} Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.425071 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerID="a0ce3e21fbd0c60abcb1117fd7fd6e2b4cb491b542db1273d4401c6b424f8827" exitCode=0 Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.425115 4699 generic.go:334] "Generic (PLEG): container finished" podID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerID="f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11" exitCode=0 Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.425655 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerDied","Data":"a0ce3e21fbd0c60abcb1117fd7fd6e2b4cb491b542db1273d4401c6b424f8827"} Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.425682 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerDied","Data":"f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11"} Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.650886 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749099 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749141 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749259 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749342 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfj4x\" (UniqueName: \"kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749363 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749479 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.749525 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data\") pod \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\" (UID: \"3a77dcc1-88a0-4a7d-9e71-aa9a89178683\") " Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.750305 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.750573 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.756268 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts" (OuterVolumeSpecName: "scripts") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.759375 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x" (OuterVolumeSpecName: "kube-api-access-bfj4x") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "kube-api-access-bfj4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.787367 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.808670 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.833503 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852174 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfj4x\" (UniqueName: \"kubernetes.io/projected/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-kube-api-access-bfj4x\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852220 4699 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852231 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852242 4699 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852258 4699 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852321 4699 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.852355 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.879497 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data" (OuterVolumeSpecName: "config-data") pod "3a77dcc1-88a0-4a7d-9e71-aa9a89178683" (UID: "3a77dcc1-88a0-4a7d-9e71-aa9a89178683"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:33 crc kubenswrapper[4699]: I1122 04:30:33.953653 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a77dcc1-88a0-4a7d-9e71-aa9a89178683-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.436208 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.436199 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a77dcc1-88a0-4a7d-9e71-aa9a89178683","Type":"ContainerDied","Data":"57d9637b2d292b9e4a4497615cb4946cd4f9ae8644224435b49d3086827bdafa"} Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.437296 4699 scope.go:117] "RemoveContainer" containerID="9eacdfe5f19fa8889d4a6e285e3d028d1202cf921e09f3f68ece219a970e4553" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.479669 4699 scope.go:117] "RemoveContainer" containerID="9c64e6454dc69e6a16029d8e755c3d2a7b0c6469d93e4615c63dc2e8ac2551dd" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.479880 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.497131 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.508705 4699 scope.go:117] "RemoveContainer" containerID="a0ce3e21fbd0c60abcb1117fd7fd6e2b4cb491b542db1273d4401c6b424f8827" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.509739 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:34 crc kubenswrapper[4699]: E1122 04:30:34.510203 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="sg-core" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510219 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="sg-core" Nov 22 04:30:34 crc kubenswrapper[4699]: E1122 04:30:34.510255 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-central-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510264 4699 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-central-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: E1122 04:30:34.510274 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-notification-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510283 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-notification-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: E1122 04:30:34.510303 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="proxy-httpd" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510310 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="proxy-httpd" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510557 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-central-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510580 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="sg-core" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510599 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="ceilometer-notification-agent" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.510623 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" containerName="proxy-httpd" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.512497 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.515608 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.515608 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.515879 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.520865 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.551665 4699 scope.go:117] "RemoveContainer" containerID="f82d5a858bb1acf2e79549e885d7e7583ac74bc14956db056d35d16a4bac7f11" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667091 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjj7s\" (UniqueName: \"kubernetes.io/projected/7858372b-0809-42b6-a01d-9db6f85d6c90-kube-api-access-wjj7s\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667241 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-config-data\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667328 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-log-httpd\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " 
pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667492 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667638 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-run-httpd\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667686 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667808 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-scripts\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.667969 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770504 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770589 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjj7s\" (UniqueName: \"kubernetes.io/projected/7858372b-0809-42b6-a01d-9db6f85d6c90-kube-api-access-wjj7s\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770682 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-config-data\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770719 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-log-httpd\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770774 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770832 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.770853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.771038 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-scripts\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.773205 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-log-httpd\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.773823 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7858372b-0809-42b6-a01d-9db6f85d6c90-run-httpd\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.776582 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.776579 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.777089 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-config-data\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.789393 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.789713 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjj7s\" (UniqueName: \"kubernetes.io/projected/7858372b-0809-42b6-a01d-9db6f85d6c90-kube-api-access-wjj7s\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.794165 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7858372b-0809-42b6-a01d-9db6f85d6c90-scripts\") pod \"ceilometer-0\" (UID: \"7858372b-0809-42b6-a01d-9db6f85d6c90\") " pod="openstack/ceilometer-0" Nov 22 04:30:34 crc kubenswrapper[4699]: I1122 04:30:34.836251 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 04:30:35 crc kubenswrapper[4699]: I1122 04:30:35.273412 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 04:30:35 crc kubenswrapper[4699]: W1122 04:30:35.277748 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7858372b_0809_42b6_a01d_9db6f85d6c90.slice/crio-9540a0d3c8410d981b51e3f1276afa6a8dda773e878fea62649cdfd46e5f1631 WatchSource:0}: Error finding container 9540a0d3c8410d981b51e3f1276afa6a8dda773e878fea62649cdfd46e5f1631: Status 404 returned error can't find the container with id 9540a0d3c8410d981b51e3f1276afa6a8dda773e878fea62649cdfd46e5f1631 Nov 22 04:30:35 crc kubenswrapper[4699]: I1122 04:30:35.458925 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a77dcc1-88a0-4a7d-9e71-aa9a89178683" path="/var/lib/kubelet/pods/3a77dcc1-88a0-4a7d-9e71-aa9a89178683/volumes" Nov 22 04:30:35 crc kubenswrapper[4699]: I1122 04:30:35.460188 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7858372b-0809-42b6-a01d-9db6f85d6c90","Type":"ContainerStarted","Data":"9540a0d3c8410d981b51e3f1276afa6a8dda773e878fea62649cdfd46e5f1631"} Nov 22 04:30:35 crc kubenswrapper[4699]: I1122 04:30:35.726290 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:35 crc kubenswrapper[4699]: I1122 04:30:35.939621 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.015652 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.096248 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmv7p\" (UniqueName: \"kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p\") pod \"f7c4c472-2f99-4aac-933b-1f19965b8d06\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.096399 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data\") pod \"f7c4c472-2f99-4aac-933b-1f19965b8d06\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.096503 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle\") pod \"f7c4c472-2f99-4aac-933b-1f19965b8d06\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.096619 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs\") pod \"f7c4c472-2f99-4aac-933b-1f19965b8d06\" (UID: \"f7c4c472-2f99-4aac-933b-1f19965b8d06\") " Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.097622 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs" (OuterVolumeSpecName: "logs") pod "f7c4c472-2f99-4aac-933b-1f19965b8d06" (UID: "f7c4c472-2f99-4aac-933b-1f19965b8d06"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.101385 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p" (OuterVolumeSpecName: "kube-api-access-pmv7p") pod "f7c4c472-2f99-4aac-933b-1f19965b8d06" (UID: "f7c4c472-2f99-4aac-933b-1f19965b8d06"). InnerVolumeSpecName "kube-api-access-pmv7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.145674 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7c4c472-2f99-4aac-933b-1f19965b8d06" (UID: "f7c4c472-2f99-4aac-933b-1f19965b8d06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.147625 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data" (OuterVolumeSpecName: "config-data") pod "f7c4c472-2f99-4aac-933b-1f19965b8d06" (UID: "f7c4c472-2f99-4aac-933b-1f19965b8d06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.200427 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.200494 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7c4c472-2f99-4aac-933b-1f19965b8d06-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.200515 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmv7p\" (UniqueName: \"kubernetes.io/projected/f7c4c472-2f99-4aac-933b-1f19965b8d06-kube-api-access-pmv7p\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.200654 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7c4c472-2f99-4aac-933b-1f19965b8d06-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.465085 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7858372b-0809-42b6-a01d-9db6f85d6c90","Type":"ContainerStarted","Data":"51a792010103209103f14630e020aff923003243fe3fd35a7260487fcde89508"} Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.470484 4699 generic.go:334] "Generic (PLEG): container finished" podID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerID="ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1" exitCode=0 Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.470597 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerDied","Data":"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1"} Nov 22 04:30:36 crc 
kubenswrapper[4699]: I1122 04:30:36.470682 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f7c4c472-2f99-4aac-933b-1f19965b8d06","Type":"ContainerDied","Data":"e9ee9e7d6a11b1bbf879ff53224a079734d73db7ea4b7d22fd4bad2bafb3d14f"} Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.470729 4699 scope.go:117] "RemoveContainer" containerID="ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.470726 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.497840 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.646396 4699 scope.go:117] "RemoveContainer" containerID="c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.692991 4699 scope.go:117] "RemoveContainer" containerID="ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1" Nov 22 04:30:36 crc kubenswrapper[4699]: E1122 04:30:36.693633 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1\": container with ID starting with ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1 not found: ID does not exist" containerID="ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.693697 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1"} err="failed to get container status \"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1\": rpc error: code = NotFound desc = 
could not find container \"ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1\": container with ID starting with ddcc30edd8b92f8d4d3f3771b7d156629a4e759ddc820a7b2a89dc9249b0efa1 not found: ID does not exist" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.693722 4699 scope.go:117] "RemoveContainer" containerID="c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4" Nov 22 04:30:36 crc kubenswrapper[4699]: E1122 04:30:36.693946 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4\": container with ID starting with c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4 not found: ID does not exist" containerID="c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.694539 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4"} err="failed to get container status \"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4\": rpc error: code = NotFound desc = could not find container \"c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4\": container with ID starting with c1c8fb17b423d693ce533542775be48b1b86295e646bae93a5f77ba921d341b4 not found: ID does not exist" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.703879 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.726593 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.735617 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:36 crc kubenswrapper[4699]: E1122 04:30:36.736363 4699 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-log" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.736383 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-log" Nov 22 04:30:36 crc kubenswrapper[4699]: E1122 04:30:36.736407 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-api" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.736414 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-api" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.736627 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-log" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.736648 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" containerName="nova-api-api" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.737702 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.740891 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.741053 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.742560 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.752970 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.810750 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.810802 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.810879 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjm6j\" (UniqueName: \"kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.811147 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.811347 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.811414 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.912796 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.912877 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.912907 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc 
kubenswrapper[4699]: I1122 04:30:36.912936 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.912954 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.913018 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjm6j\" (UniqueName: \"kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.914576 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.918963 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.920208 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data\") pod \"nova-api-0\" (UID: 
\"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.921189 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.921841 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.934300 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-s776x"] Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.936169 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.937990 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.941979 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjm6j\" (UniqueName: \"kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j\") pod \"nova-api-0\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " pod="openstack/nova-api-0" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.942160 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 04:30:36 crc kubenswrapper[4699]: I1122 04:30:36.944691 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s776x"] Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.014979 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.015060 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.015090 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx96b\" (UniqueName: 
\"kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.015361 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.075996 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.117619 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.117733 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.117768 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx96b\" (UniqueName: \"kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc 
kubenswrapper[4699]: I1122 04:30:37.117900 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.123302 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.123883 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.131595 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.139724 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx96b\" (UniqueName: \"kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b\") pod \"nova-cell1-cell-mapping-s776x\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.311937 4699 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.465931 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c4c472-2f99-4aac-933b-1f19965b8d06" path="/var/lib/kubelet/pods/f7c4c472-2f99-4aac-933b-1f19965b8d06/volumes" Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.506448 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7858372b-0809-42b6-a01d-9db6f85d6c90","Type":"ContainerStarted","Data":"aa434a2f9a2bbe40a109bb86a57370cf2af66ab9eae990a781d85d9cc4553ab8"} Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.597808 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:37 crc kubenswrapper[4699]: I1122 04:30:37.841109 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-s776x"] Nov 22 04:30:37 crc kubenswrapper[4699]: W1122 04:30:37.842969 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5273f2af_0355_484d_a907_589de1193a32.slice/crio-b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9 WatchSource:0}: Error finding container b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9: Status 404 returned error can't find the container with id b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9 Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.518039 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerStarted","Data":"ff69fe52ec6555f9a7e144cf85da4729085c535aebe38820b72c73cf7a0c6de7"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.518091 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerStarted","Data":"7cb2057f0f622116d8c5bc910da81fa37d6841c628264269dd5b174bde3aaaa0"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.518102 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerStarted","Data":"0c5245f7a36b0a6ef946d72e5222bcf43441792fbd24a7791ed0db23435cf146"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.521129 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s776x" event={"ID":"5273f2af-0355-484d-a907-589de1193a32","Type":"ContainerStarted","Data":"9c3808a684aff428d27566e481a7cbb608f3db49ad4dc5f9d4586582693fc445"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.521181 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s776x" event={"ID":"5273f2af-0355-484d-a907-589de1193a32","Type":"ContainerStarted","Data":"b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.524017 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7858372b-0809-42b6-a01d-9db6f85d6c90","Type":"ContainerStarted","Data":"80083dc30437e6f929a23bc7cac8cf97b8ba549fd8a7798d8165132376766c53"} Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.546124 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.546099474 podStartE2EDuration="2.546099474s" podCreationTimestamp="2025-11-22 04:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:38.537187154 +0000 UTC m=+1389.879808351" watchObservedRunningTime="2025-11-22 04:30:38.546099474 +0000 UTC m=+1389.888720661" Nov 22 04:30:38 crc kubenswrapper[4699]: I1122 04:30:38.556018 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-s776x" podStartSLOduration=2.555997278 podStartE2EDuration="2.555997278s" podCreationTimestamp="2025-11-22 04:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:38.552835193 +0000 UTC m=+1389.895456390" watchObservedRunningTime="2025-11-22 04:30:38.555997278 +0000 UTC m=+1389.898618465" Nov 22 04:30:39 crc kubenswrapper[4699]: I1122 04:30:39.538852 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7858372b-0809-42b6-a01d-9db6f85d6c90","Type":"ContainerStarted","Data":"2c8716754e4b2eca237e44bc50636274ef17706ea75f48ed4e17bc33cdff036b"} Nov 22 04:30:39 crc kubenswrapper[4699]: I1122 04:30:39.568371 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.800620577 podStartE2EDuration="5.568349283s" podCreationTimestamp="2025-11-22 04:30:34 +0000 UTC" firstStartedPulling="2025-11-22 04:30:35.279765358 +0000 UTC m=+1386.622386545" lastFinishedPulling="2025-11-22 04:30:39.047494064 +0000 UTC m=+1390.390115251" observedRunningTime="2025-11-22 04:30:39.567072023 +0000 UTC m=+1390.909693230" watchObservedRunningTime="2025-11-22 04:30:39.568349283 +0000 UTC m=+1390.910970470" Nov 22 04:30:39 crc kubenswrapper[4699]: I1122 04:30:39.913663 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-q6kxn" Nov 22 04:30:39 crc kubenswrapper[4699]: I1122 04:30:39.993940 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"] Nov 22 04:30:39 crc kubenswrapper[4699]: I1122 04:30:39.994182 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" podUID="b73072a8-266e-4352-8f8a-371f64be1988" 
containerName="dnsmasq-dns" containerID="cri-o://dffbde150b14f2f342df64f5c7e4cfd56ab88a5e00d59a77ba5240fb57ce1fa3" gracePeriod=10 Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.561200 4699 generic.go:334] "Generic (PLEG): container finished" podID="b73072a8-266e-4352-8f8a-371f64be1988" containerID="dffbde150b14f2f342df64f5c7e4cfd56ab88a5e00d59a77ba5240fb57ce1fa3" exitCode=0 Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.561286 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" event={"ID":"b73072a8-266e-4352-8f8a-371f64be1988","Type":"ContainerDied","Data":"dffbde150b14f2f342df64f5c7e4cfd56ab88a5e00d59a77ba5240fb57ce1fa3"} Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.561619 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" event={"ID":"b73072a8-266e-4352-8f8a-371f64be1988","Type":"ContainerDied","Data":"0f10df95e4ab69fa6c7043fb44e2e61e898c87df9ab517ceae3cd92ccaf1b686"} Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.561633 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f10df95e4ab69fa6c7043fb44e2e61e898c87df9ab517ceae3cd92ccaf1b686" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.561965 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.592348 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.732263 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.732771 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.733022 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.733172 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h85td\" (UniqueName: \"kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.733270 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.733391 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0\") pod \"b73072a8-266e-4352-8f8a-371f64be1988\" (UID: \"b73072a8-266e-4352-8f8a-371f64be1988\") " Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.740190 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td" (OuterVolumeSpecName: "kube-api-access-h85td") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "kube-api-access-h85td". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.782918 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.790065 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.796245 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config" (OuterVolumeSpecName: "config") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.811672 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.811698 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b73072a8-266e-4352-8f8a-371f64be1988" (UID: "b73072a8-266e-4352-8f8a-371f64be1988"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835637 4699 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835680 4699 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835692 4699 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-config\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835703 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835715 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h85td\" (UniqueName: \"kubernetes.io/projected/b73072a8-266e-4352-8f8a-371f64be1988-kube-api-access-h85td\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:40 crc kubenswrapper[4699]: I1122 04:30:40.835729 4699 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b73072a8-266e-4352-8f8a-371f64be1988-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:41 crc kubenswrapper[4699]: I1122 04:30:41.570150 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-m55tb" Nov 22 04:30:41 crc kubenswrapper[4699]: I1122 04:30:41.602033 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"] Nov 22 04:30:41 crc kubenswrapper[4699]: I1122 04:30:41.613981 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-m55tb"] Nov 22 04:30:43 crc kubenswrapper[4699]: I1122 04:30:43.460811 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73072a8-266e-4352-8f8a-371f64be1988" path="/var/lib/kubelet/pods/b73072a8-266e-4352-8f8a-371f64be1988/volumes" Nov 22 04:30:43 crc kubenswrapper[4699]: I1122 04:30:43.587700 4699 generic.go:334] "Generic (PLEG): container finished" podID="5273f2af-0355-484d-a907-589de1193a32" containerID="9c3808a684aff428d27566e481a7cbb608f3db49ad4dc5f9d4586582693fc445" exitCode=0 Nov 22 04:30:43 crc kubenswrapper[4699]: I1122 04:30:43.587740 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s776x" event={"ID":"5273f2af-0355-484d-a907-589de1193a32","Type":"ContainerDied","Data":"9c3808a684aff428d27566e481a7cbb608f3db49ad4dc5f9d4586582693fc445"} Nov 22 04:30:44 crc kubenswrapper[4699]: I1122 04:30:44.967678 4699 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.131218 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle\") pod \"5273f2af-0355-484d-a907-589de1193a32\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.131880 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts\") pod \"5273f2af-0355-484d-a907-589de1193a32\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.132041 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data\") pod \"5273f2af-0355-484d-a907-589de1193a32\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.132258 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx96b\" (UniqueName: \"kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b\") pod \"5273f2af-0355-484d-a907-589de1193a32\" (UID: \"5273f2af-0355-484d-a907-589de1193a32\") " Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.137428 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts" (OuterVolumeSpecName: "scripts") pod "5273f2af-0355-484d-a907-589de1193a32" (UID: "5273f2af-0355-484d-a907-589de1193a32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.137664 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b" (OuterVolumeSpecName: "kube-api-access-dx96b") pod "5273f2af-0355-484d-a907-589de1193a32" (UID: "5273f2af-0355-484d-a907-589de1193a32"). InnerVolumeSpecName "kube-api-access-dx96b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.159421 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data" (OuterVolumeSpecName: "config-data") pod "5273f2af-0355-484d-a907-589de1193a32" (UID: "5273f2af-0355-484d-a907-589de1193a32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.163295 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5273f2af-0355-484d-a907-589de1193a32" (UID: "5273f2af-0355-484d-a907-589de1193a32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.235205 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx96b\" (UniqueName: \"kubernetes.io/projected/5273f2af-0355-484d-a907-589de1193a32-kube-api-access-dx96b\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.235244 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.235253 4699 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.235261 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5273f2af-0355-484d-a907-589de1193a32-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.605984 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-s776x" event={"ID":"5273f2af-0355-484d-a907-589de1193a32","Type":"ContainerDied","Data":"b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9"} Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.606026 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b94cb497db955deb43b66fd56529419a7aabf17a4c8f02a36501778402951ea9" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.606041 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-s776x" Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.778257 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.778660 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-log" containerID="cri-o://7cb2057f0f622116d8c5bc910da81fa37d6841c628264269dd5b174bde3aaaa0" gracePeriod=30 Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.778722 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-api" containerID="cri-o://ff69fe52ec6555f9a7e144cf85da4729085c535aebe38820b72c73cf7a0c6de7" gracePeriod=30 Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.792773 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.792994 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler" containerID="cri-o://e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" gracePeriod=30 Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.811622 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.811844 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log" containerID="cri-o://cbc6c18c8ab51c74c196ee88bfa5b0099d42b8f3373297241d74a8059d98d955" gracePeriod=30 Nov 22 04:30:45 crc kubenswrapper[4699]: I1122 04:30:45.811993 4699 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata" containerID="cri-o://7b718a632f8c2933bd0b70f40b11df0e416eb21e6bd6ccc463cc705f917b169e" gracePeriod=30 Nov 22 04:30:46 crc kubenswrapper[4699]: E1122 04:30:46.425220 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:46 crc kubenswrapper[4699]: E1122 04:30:46.429153 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:46 crc kubenswrapper[4699]: E1122 04:30:46.433671 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:46 crc kubenswrapper[4699]: E1122 04:30:46.433730 4699 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler" Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.633472 4699 generic.go:334] "Generic (PLEG): container finished" 
podID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerID="ff69fe52ec6555f9a7e144cf85da4729085c535aebe38820b72c73cf7a0c6de7" exitCode=0 Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.633512 4699 generic.go:334] "Generic (PLEG): container finished" podID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerID="7cb2057f0f622116d8c5bc910da81fa37d6841c628264269dd5b174bde3aaaa0" exitCode=143 Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.633557 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerDied","Data":"ff69fe52ec6555f9a7e144cf85da4729085c535aebe38820b72c73cf7a0c6de7"} Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.633583 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerDied","Data":"7cb2057f0f622116d8c5bc910da81fa37d6841c628264269dd5b174bde3aaaa0"} Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.635261 4699 generic.go:334] "Generic (PLEG): container finished" podID="d03f6427-651c-4de6-851f-a2961d706e99" containerID="cbc6c18c8ab51c74c196ee88bfa5b0099d42b8f3373297241d74a8059d98d955" exitCode=143 Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.635284 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerDied","Data":"cbc6c18c8ab51c74c196ee88bfa5b0099d42b8f3373297241d74a8059d98d955"} Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.858543 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.979078 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjm6j\" (UniqueName: \"kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.979147 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.979247 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.980198 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.980324 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.980395 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle\") pod \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\" (UID: \"205155e1-ef6e-43ce-9783-9abf9acd3a7b\") " Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.980195 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs" (OuterVolumeSpecName: "logs") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:46 crc kubenswrapper[4699]: I1122 04:30:46.985811 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j" (OuterVolumeSpecName: "kube-api-access-sjm6j") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "kube-api-access-sjm6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.014366 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.015127 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data" (OuterVolumeSpecName: "config-data") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.040135 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.049850 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "205155e1-ef6e-43ce-9783-9abf9acd3a7b" (UID: "205155e1-ef6e-43ce-9783-9abf9acd3a7b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084042 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjm6j\" (UniqueName: \"kubernetes.io/projected/205155e1-ef6e-43ce-9783-9abf9acd3a7b-kube-api-access-sjm6j\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084094 4699 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084111 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/205155e1-ef6e-43ce-9783-9abf9acd3a7b-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084123 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-config-data\") on node \"crc\" DevicePath 
\"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084136 4699 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.084149 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205155e1-ef6e-43ce-9783-9abf9acd3a7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.646839 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"205155e1-ef6e-43ce-9783-9abf9acd3a7b","Type":"ContainerDied","Data":"0c5245f7a36b0a6ef946d72e5222bcf43441792fbd24a7791ed0db23435cf146"} Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.646915 4699 scope.go:117] "RemoveContainer" containerID="ff69fe52ec6555f9a7e144cf85da4729085c535aebe38820b72c73cf7a0c6de7" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.646927 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.676002 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.681941 4699 scope.go:117] "RemoveContainer" containerID="7cb2057f0f622116d8c5bc910da81fa37d6841c628264269dd5b174bde3aaaa0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.686197 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705120 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:47 crc kubenswrapper[4699]: E1122 04:30:47.705482 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-api" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705499 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-api" Nov 22 04:30:47 crc kubenswrapper[4699]: E1122 04:30:47.705515 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-log" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705522 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-log" Nov 22 04:30:47 crc kubenswrapper[4699]: E1122 04:30:47.705536 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73072a8-266e-4352-8f8a-371f64be1988" containerName="dnsmasq-dns" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705542 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73072a8-266e-4352-8f8a-371f64be1988" containerName="dnsmasq-dns" Nov 22 04:30:47 crc kubenswrapper[4699]: E1122 04:30:47.705566 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b73072a8-266e-4352-8f8a-371f64be1988" containerName="init" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705574 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73072a8-266e-4352-8f8a-371f64be1988" containerName="init" Nov 22 04:30:47 crc kubenswrapper[4699]: E1122 04:30:47.705587 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5273f2af-0355-484d-a907-589de1193a32" containerName="nova-manage" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705592 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="5273f2af-0355-484d-a907-589de1193a32" containerName="nova-manage" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705833 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73072a8-266e-4352-8f8a-371f64be1988" containerName="dnsmasq-dns" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705847 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-api" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705860 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" containerName="nova-api-log" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.705873 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="5273f2af-0355-484d-a907-589de1193a32" containerName="nova-manage" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.707185 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.710055 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.710305 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.714944 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.722089 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.906788 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-public-tls-certs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.906870 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c614af4-7edb-4d51-9b42-5826d1cf656b-logs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.906992 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj9j\" (UniqueName: \"kubernetes.io/projected/6c614af4-7edb-4d51-9b42-5826d1cf656b-kube-api-access-kfj9j\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.907576 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.907620 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:47 crc kubenswrapper[4699]: I1122 04:30:47.907937 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-config-data\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011256 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-public-tls-certs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011359 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c614af4-7edb-4d51-9b42-5826d1cf656b-logs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011581 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj9j\" (UniqueName: \"kubernetes.io/projected/6c614af4-7edb-4d51-9b42-5826d1cf656b-kube-api-access-kfj9j\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 
04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011650 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011695 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.011853 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-config-data\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.012103 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c614af4-7edb-4d51-9b42-5826d1cf656b-logs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.016095 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.016108 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-public-tls-certs\") pod \"nova-api-0\" 
(UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.017668 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-config-data\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.018211 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c614af4-7edb-4d51-9b42-5826d1cf656b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.040754 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj9j\" (UniqueName: \"kubernetes.io/projected/6c614af4-7edb-4d51-9b42-5826d1cf656b-kube-api-access-kfj9j\") pod \"nova-api-0\" (UID: \"6c614af4-7edb-4d51-9b42-5826d1cf656b\") " pod="openstack/nova-api-0" Nov 22 04:30:48 crc kubenswrapper[4699]: I1122 04:30:48.042011 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:48.964620 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:54424->10.217.0.202:8775: read: connection reset by peer" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:48.964636 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:54426->10.217.0.202:8775: read: connection reset by peer" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:49.460174 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205155e1-ef6e-43ce-9783-9abf9acd3a7b" path="/var/lib/kubelet/pods/205155e1-ef6e-43ce-9783-9abf9acd3a7b/volumes" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:49.725729 4699 generic.go:334] "Generic (PLEG): container finished" podID="d03f6427-651c-4de6-851f-a2961d706e99" containerID="7b718a632f8c2933bd0b70f40b11df0e416eb21e6bd6ccc463cc705f917b169e" exitCode=0 Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:49.725811 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerDied","Data":"7b718a632f8c2933bd0b70f40b11df0e416eb21e6bd6ccc463cc705f917b169e"} Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:51.422917 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" 
containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:51.423392 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:51.423712 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:51.423797 4699 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:51.748869 4699 generic.go:334] "Generic (PLEG): container finished" podID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" exitCode=0 Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:51.748989 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"0dc7d0f2-e54f-4255-99e0-276a543227b1","Type":"ContainerDied","Data":"e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a"} Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:53.467818 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": dial tcp 10.217.0.202:8775: connect: connection refused" Nov 22 04:30:56 crc kubenswrapper[4699]: I1122 04:30:53.468133 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": dial tcp 10.217.0.202:8775: connect: connection refused" Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:56.423249 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:56.424451 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:56.424789 4699 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID 
of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 04:30:56 crc kubenswrapper[4699]: E1122 04:30:56.424823 4699 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.331868 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.333274 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.379354 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 04:30:57 crc kubenswrapper[4699]: W1122 04:30:57.381612 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c614af4_7edb_4d51_9b42_5826d1cf656b.slice/crio-caac57f553ccc3848d43726031c0ff250ca17f4034c4ea9ea34bb17cc9c206a4 WatchSource:0}: Error finding container caac57f553ccc3848d43726031c0ff250ca17f4034c4ea9ea34bb17cc9c206a4: Status 404 returned error can't find the container with id caac57f553ccc3848d43726031c0ff250ca17f4034c4ea9ea34bb17cc9c206a4 Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.410585 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs\") pod \"d03f6427-651c-4de6-851f-a2961d706e99\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.410714 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs\") pod \"d03f6427-651c-4de6-851f-a2961d706e99\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.410793 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x84x5\" (UniqueName: \"kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5\") pod \"0dc7d0f2-e54f-4255-99e0-276a543227b1\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.419310 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs" (OuterVolumeSpecName: "logs") pod "d03f6427-651c-4de6-851f-a2961d706e99" (UID: "d03f6427-651c-4de6-851f-a2961d706e99"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.422849 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle\") pod \"0dc7d0f2-e54f-4255-99e0-276a543227b1\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.422973 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data\") pod \"0dc7d0f2-e54f-4255-99e0-276a543227b1\" (UID: \"0dc7d0f2-e54f-4255-99e0-276a543227b1\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.423044 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle\") pod \"d03f6427-651c-4de6-851f-a2961d706e99\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.423074 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdzl\" (UniqueName: \"kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl\") pod \"d03f6427-651c-4de6-851f-a2961d706e99\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.423215 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data\") pod \"d03f6427-651c-4de6-851f-a2961d706e99\" (UID: \"d03f6427-651c-4de6-851f-a2961d706e99\") " Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.424961 4699 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d03f6427-651c-4de6-851f-a2961d706e99-logs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.425302 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5" (OuterVolumeSpecName: "kube-api-access-x84x5") pod "0dc7d0f2-e54f-4255-99e0-276a543227b1" (UID: "0dc7d0f2-e54f-4255-99e0-276a543227b1"). InnerVolumeSpecName "kube-api-access-x84x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.428362 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl" (OuterVolumeSpecName: "kube-api-access-vgdzl") pod "d03f6427-651c-4de6-851f-a2961d706e99" (UID: "d03f6427-651c-4de6-851f-a2961d706e99"). InnerVolumeSpecName "kube-api-access-vgdzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.458682 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data" (OuterVolumeSpecName: "config-data") pod "0dc7d0f2-e54f-4255-99e0-276a543227b1" (UID: "0dc7d0f2-e54f-4255-99e0-276a543227b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.476809 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data" (OuterVolumeSpecName: "config-data") pod "d03f6427-651c-4de6-851f-a2961d706e99" (UID: "d03f6427-651c-4de6-851f-a2961d706e99"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.477286 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dc7d0f2-e54f-4255-99e0-276a543227b1" (UID: "0dc7d0f2-e54f-4255-99e0-276a543227b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.485012 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d03f6427-651c-4de6-851f-a2961d706e99" (UID: "d03f6427-651c-4de6-851f-a2961d706e99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.498609 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d03f6427-651c-4de6-851f-a2961d706e99" (UID: "d03f6427-651c-4de6-851f-a2961d706e99"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527150 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527183 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527218 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdzl\" (UniqueName: \"kubernetes.io/projected/d03f6427-651c-4de6-851f-a2961d706e99-kube-api-access-vgdzl\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527228 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527236 4699 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f6427-651c-4de6-851f-a2961d706e99-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527244 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x84x5\" (UniqueName: \"kubernetes.io/projected/0dc7d0f2-e54f-4255-99e0-276a543227b1-kube-api-access-x84x5\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.527252 4699 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dc7d0f2-e54f-4255-99e0-276a543227b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 04:30:57 crc 
kubenswrapper[4699]: I1122 04:30:57.816301 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d03f6427-651c-4de6-851f-a2961d706e99","Type":"ContainerDied","Data":"dfa062d51e1fefd65fefd7a83b4cde4961e1fed65f9bf0ebe674bd43b4f8bcd6"}
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.816321 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.816736 4699 scope.go:117] "RemoveContainer" containerID="7b718a632f8c2933bd0b70f40b11df0e416eb21e6bd6ccc463cc705f917b169e"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.818362 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c614af4-7edb-4d51-9b42-5826d1cf656b","Type":"ContainerStarted","Data":"caac57f553ccc3848d43726031c0ff250ca17f4034c4ea9ea34bb17cc9c206a4"}
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.821050 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0dc7d0f2-e54f-4255-99e0-276a543227b1","Type":"ContainerDied","Data":"7d5240867b3476dec84a50b52728da5a86d07a17ac81b3673efd8caf5ad12db1"}
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.821137 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.844402 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.845819 4699 scope.go:117] "RemoveContainer" containerID="cbc6c18c8ab51c74c196ee88bfa5b0099d42b8f3373297241d74a8059d98d955"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.866858 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.879992 4699 scope.go:117] "RemoveContainer" containerID="e949861032a043e60f37b205279b56bcbdbf22c8b8b76d88f3828d9b2a87815a"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.884489 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.905530 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.922236 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: E1122 04:30:57.923869 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.923985 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler"
Nov 22 04:30:57 crc kubenswrapper[4699]: E1122 04:30:57.924147 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.924214 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log"
Nov 22 04:30:57 crc kubenswrapper[4699]: E1122 04:30:57.924287 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.924352 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.924741 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-metadata"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.924831 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" containerName="nova-scheduler-scheduler"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.924902 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03f6427-651c-4de6-851f-a2961d706e99" containerName="nova-metadata-log"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.926664 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.931580 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.931595 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.937620 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.939874 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.944140 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.952700 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:30:57 crc kubenswrapper[4699]: I1122 04:30:57.967981 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.036870 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037194 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5d8c3b-bc84-4687-8f1d-c4763aba383c-logs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037233 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037264 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntqb4\" (UniqueName: \"kubernetes.io/projected/da5d8c3b-bc84-4687-8f1d-c4763aba383c-kube-api-access-ntqb4\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037297 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037319 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-config-data\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037465 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zfb\" (UniqueName: \"kubernetes.io/projected/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-kube-api-access-98zfb\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.037776 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-config-data\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.138917 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5d8c3b-bc84-4687-8f1d-c4763aba383c-logs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.138981 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139018 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntqb4\" (UniqueName: \"kubernetes.io/projected/da5d8c3b-bc84-4687-8f1d-c4763aba383c-kube-api-access-ntqb4\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139052 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139076 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-config-data\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139104 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zfb\" (UniqueName: \"kubernetes.io/projected/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-kube-api-access-98zfb\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139144 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-config-data\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139167 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.139556 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da5d8c3b-bc84-4687-8f1d-c4763aba383c-logs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.143303 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.143524 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.143811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-config-data\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.146236 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da5d8c3b-bc84-4687-8f1d-c4763aba383c-config-data\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.148740 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.155576 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98zfb\" (UniqueName: \"kubernetes.io/projected/6bdcc9f1-da80-479b-b5d2-f4487ed993c7-kube-api-access-98zfb\") pod \"nova-scheduler-0\" (UID: \"6bdcc9f1-da80-479b-b5d2-f4487ed993c7\") " pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.157620 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntqb4\" (UniqueName: \"kubernetes.io/projected/da5d8c3b-bc84-4687-8f1d-c4763aba383c-kube-api-access-ntqb4\") pod \"nova-metadata-0\" (UID: \"da5d8c3b-bc84-4687-8f1d-c4763aba383c\") " pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.265625 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.268680 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.731319 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.746207 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.838898 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6bdcc9f1-da80-479b-b5d2-f4487ed993c7","Type":"ContainerStarted","Data":"f5f0bcc9fb1fc7e9cdb512b8416522381e436c649e0dfe23c5dbfc9ec71cccf9"}
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.843336 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c614af4-7edb-4d51-9b42-5826d1cf656b","Type":"ContainerStarted","Data":"c3ea4dff6e63559868247069462162ed6ad012ee3fd1bac26f07f78b530937e6"}
Nov 22 04:30:58 crc kubenswrapper[4699]: I1122 04:30:58.846268 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da5d8c3b-bc84-4687-8f1d-c4763aba383c","Type":"ContainerStarted","Data":"b431b48dedfa735b2c9848188988587ee4aed3eb7e00f95a6ddb8956b7e74b02"}
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.463809 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc7d0f2-e54f-4255-99e0-276a543227b1" path="/var/lib/kubelet/pods/0dc7d0f2-e54f-4255-99e0-276a543227b1/volumes"
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.464873 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03f6427-651c-4de6-851f-a2961d706e99" path="/var/lib/kubelet/pods/d03f6427-651c-4de6-851f-a2961d706e99/volumes"
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.857792 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da5d8c3b-bc84-4687-8f1d-c4763aba383c","Type":"ContainerStarted","Data":"752619d284aa14ecda0cce62c9078d74d2ef7e3cc671e9a02d2b06143f6088c4"}
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.859353 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6bdcc9f1-da80-479b-b5d2-f4487ed993c7","Type":"ContainerStarted","Data":"174d7502faeccbc3a1ad1d76200610b0df63c8db42eb9d9bc3db0b0f99a17fc1"}
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.861816 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6c614af4-7edb-4d51-9b42-5826d1cf656b","Type":"ContainerStarted","Data":"d544210b576bd93a42315cc4907d8349d1029c90f42d527346cff58b3fb0fb8a"}
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.918790 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.91876886 podStartE2EDuration="2.91876886s" podCreationTimestamp="2025-11-22 04:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:59.875021127 +0000 UTC m=+1411.217642334" watchObservedRunningTime="2025-11-22 04:30:59.91876886 +0000 UTC m=+1411.261390047"
Nov 22 04:30:59 crc kubenswrapper[4699]: I1122 04:30:59.927162 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=12.927144337 podStartE2EDuration="12.927144337s" podCreationTimestamp="2025-11-22 04:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:30:59.907611166 +0000 UTC m=+1411.250232353" watchObservedRunningTime="2025-11-22 04:30:59.927144337 +0000 UTC m=+1411.269765514"
Nov 22 04:31:00 crc kubenswrapper[4699]: I1122 04:31:00.872820 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da5d8c3b-bc84-4687-8f1d-c4763aba383c","Type":"ContainerStarted","Data":"4079ded2fbe3325614032341439db1b7e2d11bc6a678974305b5dc1c02ab3875"}
Nov 22 04:31:00 crc kubenswrapper[4699]: I1122 04:31:00.890427 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.890401284 podStartE2EDuration="3.890401284s" podCreationTimestamp="2025-11-22 04:30:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:31:00.887910286 +0000 UTC m=+1412.230531493" watchObservedRunningTime="2025-11-22 04:31:00.890401284 +0000 UTC m=+1412.233022491"
Nov 22 04:31:03 crc kubenswrapper[4699]: I1122 04:31:03.265697 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 04:31:03 crc kubenswrapper[4699]: I1122 04:31:03.266287 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 04:31:03 crc kubenswrapper[4699]: I1122 04:31:03.268760 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 04:31:04 crc kubenswrapper[4699]: I1122 04:31:04.848590 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.043209 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.044727 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.266053 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.266097 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.269794 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.301701 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.725662 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.725720 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 04:31:08 crc kubenswrapper[4699]: I1122 04:31:08.987483 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 22 04:31:09 crc kubenswrapper[4699]: I1122 04:31:09.055631 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6c614af4-7edb-4d51-9b42-5826d1cf656b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 04:31:09 crc kubenswrapper[4699]: I1122 04:31:09.055686 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6c614af4-7edb-4d51-9b42-5826d1cf656b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 04:31:09 crc kubenswrapper[4699]: I1122 04:31:09.290628 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da5d8c3b-bc84-4687-8f1d-c4763aba383c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 04:31:09 crc kubenswrapper[4699]: I1122 04:31:09.290651 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da5d8c3b-bc84-4687-8f1d-c4763aba383c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.042742 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.043384 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.049874 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.050474 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.273779 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.273893 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.280715 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 04:31:18 crc kubenswrapper[4699]: I1122 04:31:18.280981 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 04:31:19 crc kubenswrapper[4699]: I1122 04:31:19.053732 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 04:31:19 crc kubenswrapper[4699]: I1122 04:31:19.054269 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 04:31:28 crc kubenswrapper[4699]: I1122 04:31:28.095208 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 04:31:28 crc kubenswrapper[4699]: I1122 04:31:28.978658 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 04:31:31 crc kubenswrapper[4699]: I1122 04:31:31.827003 4699 scope.go:117] "RemoveContainer" containerID="522cfd2e77f901493bb0502a1be8ae64f66af845bbc9f8a50c9da6781070b0c1"
Nov 22 04:31:32 crc kubenswrapper[4699]: I1122 04:31:32.394105 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="rabbitmq" containerID="cri-o://bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857" gracePeriod=604796
Nov 22 04:31:33 crc kubenswrapper[4699]: I1122 04:31:33.074389 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="rabbitmq" containerID="cri-o://46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2" gracePeriod=604796
Nov 22 04:31:33 crc kubenswrapper[4699]: I1122 04:31:33.418909 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Nov 22 04:31:33 crc kubenswrapper[4699]: I1122 04:31:33.724810 4699 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Nov 22 04:31:38 crc kubenswrapper[4699]: I1122 04:31:38.726467 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 04:31:38 crc kubenswrapper[4699]: I1122 04:31:38.727045 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 04:31:38 crc kubenswrapper[4699]: I1122 04:31:38.994565 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109231 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109310 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109335 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109356 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109378 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.109784 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.110258 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.110854 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.111174 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.111298 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.111372 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.111396 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdzj9\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9\") pod \"964a7a4a-f709-43ea-85f2-93a8273d503d\" (UID: \"964a7a4a-f709-43ea-85f2-93a8273d503d\") "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.112019 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.112451 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.112835 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.112856 4699 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-plugins-conf\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.112869 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.118151 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9" (OuterVolumeSpecName: "kube-api-access-xdzj9") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "kube-api-access-xdzj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.119996 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.120733 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.135721 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info" (OuterVolumeSpecName: "pod-info") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.137628 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.156079 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data" (OuterVolumeSpecName: "config-data") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.192165 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf" (OuterVolumeSpecName: "server-conf") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215396 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215444 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215472 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215483 4699 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/964a7a4a-f709-43ea-85f2-93a8273d503d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215491 4699 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/964a7a4a-f709-43ea-85f2-93a8273d503d-pod-info\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215500 4699 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/964a7a4a-f709-43ea-85f2-93a8273d503d-server-conf\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.215510 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdzj9\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-kube-api-access-xdzj9\") on node \"crc\" DevicePath \"\""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.232977 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "964a7a4a-f709-43ea-85f2-93a8273d503d" (UID: "964a7a4a-f709-43ea-85f2-93a8273d503d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.235862 4699 generic.go:334] "Generic (PLEG): container finished" podID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerID="bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857" exitCode=0
Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.235905 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.235913 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerDied","Data":"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857"} Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.235946 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"964a7a4a-f709-43ea-85f2-93a8273d503d","Type":"ContainerDied","Data":"851562825c295b58e1efb42ba4c22c75ff571041dfc856a2c7150d6f21a2b299"} Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.235964 4699 scope.go:117] "RemoveContainer" containerID="bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.257116 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.312180 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.317915 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.317960 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/964a7a4a-f709-43ea-85f2-93a8273d503d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.325259 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.345471 4699 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:31:39 crc kubenswrapper[4699]: E1122 04:31:39.345987 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="setup-container" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.346009 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="setup-container" Nov 22 04:31:39 crc kubenswrapper[4699]: E1122 04:31:39.346025 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="rabbitmq" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.346033 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="rabbitmq" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.346266 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" containerName="rabbitmq" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.347549 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.349953 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.350181 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.350336 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vbc4v" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.350546 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.351423 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.351592 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.351918 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.385721 4699 scope.go:117] "RemoveContainer" containerID="922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.389531 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.426597 4699 scope.go:117] "RemoveContainer" containerID="bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857" Nov 22 04:31:39 crc kubenswrapper[4699]: E1122 04:31:39.427280 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857\": container with ID starting with bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857 not found: ID does not exist" containerID="bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.427334 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857"} err="failed to get container status \"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857\": rpc error: code = NotFound desc = could not find container \"bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857\": container with ID starting with bb13455a4929110bd93e56258064b66566763c83228e0e788449171fda6f4857 not found: ID does not exist" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.427834 4699 scope.go:117] "RemoveContainer" containerID="922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4" Nov 22 04:31:39 crc kubenswrapper[4699]: E1122 04:31:39.428138 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4\": container with ID starting with 922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4 not found: ID does not exist" containerID="922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.428231 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4"} err="failed to get container status \"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4\": rpc error: code = NotFound desc = could not find container \"922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4\": container with ID 
starting with 922cc6338789229c693e736b86854e08f2e6f96b709403c471b4ab8e464ef1b4 not found: ID does not exist" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.461551 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="964a7a4a-f709-43ea-85f2-93a8273d503d" path="/var/lib/kubelet/pods/964a7a4a-f709-43ea-85f2-93a8273d503d/volumes" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.521978 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522034 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522068 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522140 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522164 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522208 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpt6d\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-kube-api-access-mpt6d\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522276 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522318 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522367 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522446 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.522486 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624152 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624200 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624229 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624287 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624304 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624325 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpt6d\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-kube-api-access-mpt6d\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624351 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624382 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624414 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624463 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.624482 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.625999 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.626707 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.626951 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.627139 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.627857 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.628086 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.629584 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.630545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.631132 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc 
kubenswrapper[4699]: I1122 04:31:39.640499 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.645514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpt6d\" (UniqueName: \"kubernetes.io/projected/2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8-kube-api-access-mpt6d\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.678189 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8\") " pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.772418 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940252 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940329 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940372 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940417 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940618 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pql4w\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940671 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940743 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940776 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940806 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940835 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.940860 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf\") pod \"43d42bf1-de55-49eb-990f-451ad31d0e21\" (UID: \"43d42bf1-de55-49eb-990f-451ad31d0e21\") " Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 
04:31:39.942503 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.942528 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.943561 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.947140 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.947969 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w" (OuterVolumeSpecName: "kube-api-access-pql4w") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "kube-api-access-pql4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.962357 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info" (OuterVolumeSpecName: "pod-info") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.962719 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.962821 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.975879 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 04:31:39 crc kubenswrapper[4699]: I1122 04:31:39.993377 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data" (OuterVolumeSpecName: "config-data") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.018279 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf" (OuterVolumeSpecName: "server-conf") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042751 4699 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042790 4699 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042801 4699 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/43d42bf1-de55-49eb-990f-451ad31d0e21-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042809 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc 
kubenswrapper[4699]: I1122 04:31:40.042817 4699 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042825 4699 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/43d42bf1-de55-49eb-990f-451ad31d0e21-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042834 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042842 4699 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/43d42bf1-de55-49eb-990f-451ad31d0e21-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042851 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pql4w\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-kube-api-access-pql4w\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.042859 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.063595 4699 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.079098 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "43d42bf1-de55-49eb-990f-451ad31d0e21" (UID: "43d42bf1-de55-49eb-990f-451ad31d0e21"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.144950 4699 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.144984 4699 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/43d42bf1-de55-49eb-990f-451ad31d0e21-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.252143 4699 generic.go:334] "Generic (PLEG): container finished" podID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerID="46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2" exitCode=0 Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.252193 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerDied","Data":"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2"} Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.252223 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"43d42bf1-de55-49eb-990f-451ad31d0e21","Type":"ContainerDied","Data":"150006f975a9f60f659431357eac4f00f61ab9231d38d7e5a06fc25b10419734"} Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.252230 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.252248 4699 scope.go:117] "RemoveContainer" containerID="46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.277148 4699 scope.go:117] "RemoveContainer" containerID="54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.297738 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.328203 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.329015 4699 scope.go:117] "RemoveContainer" containerID="46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2" Nov 22 04:31:40 crc kubenswrapper[4699]: E1122 04:31:40.329248 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2\": container with ID starting with 46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2 not found: ID does not exist" containerID="46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.329277 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2"} err="failed to get container status \"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2\": rpc error: code = NotFound desc = could not find container \"46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2\": container with ID starting with 46c87d2a62fb4ca084c1e3345da9baf035ce1b5a33e1a4c13e4dfe636abd19e2 not found: ID does not exist" Nov 22 04:31:40 crc 
kubenswrapper[4699]: I1122 04:31:40.329297 4699 scope.go:117] "RemoveContainer" containerID="54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49" Nov 22 04:31:40 crc kubenswrapper[4699]: E1122 04:31:40.329520 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49\": container with ID starting with 54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49 not found: ID does not exist" containerID="54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.329541 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49"} err="failed to get container status \"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49\": rpc error: code = NotFound desc = could not find container \"54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49\": container with ID starting with 54cce131e13a928cbbe825a4a23558e4febfb3b63b057f90b289d3f8f7b28d49 not found: ID does not exist" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.339559 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:31:40 crc kubenswrapper[4699]: E1122 04:31:40.340183 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="setup-container" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.340202 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="setup-container" Nov 22 04:31:40 crc kubenswrapper[4699]: E1122 04:31:40.340229 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="rabbitmq" Nov 22 04:31:40 crc 
kubenswrapper[4699]: I1122 04:31:40.340239 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="rabbitmq" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.340510 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" containerName="rabbitmq" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.341932 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.346314 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.346511 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.347551 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.347966 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.348279 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.349119 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s5km4" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.350240 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.354835 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.453816 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.453875 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.453901 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.453955 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/522fc300-2659-442f-9311-65aa82b05e99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.453995 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/522fc300-2659-442f-9311-65aa82b05e99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454019 4699 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454042 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454059 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454094 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvn49\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-kube-api-access-xvn49\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454136 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.454153 4699 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.471644 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.560879 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/522fc300-2659-442f-9311-65aa82b05e99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.560972 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/522fc300-2659-442f-9311-65aa82b05e99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561005 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561023 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561040 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561076 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvn49\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-kube-api-access-xvn49\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561128 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561147 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 
04:31:40.561338 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.561363 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.562412 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.562606 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.562794 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.564248 4699 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.564814 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.567593 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.569454 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/522fc300-2659-442f-9311-65aa82b05e99-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.569686 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/522fc300-2659-442f-9311-65aa82b05e99-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.570956 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/522fc300-2659-442f-9311-65aa82b05e99-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.573649 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.584400 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvn49\" (UniqueName: \"kubernetes.io/projected/522fc300-2659-442f-9311-65aa82b05e99-kube-api-access-xvn49\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.599302 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"522fc300-2659-442f-9311-65aa82b05e99\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:40 crc kubenswrapper[4699]: I1122 04:31:40.680566 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:31:41 crc kubenswrapper[4699]: I1122 04:31:41.139560 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 04:31:41 crc kubenswrapper[4699]: I1122 04:31:41.283693 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"522fc300-2659-442f-9311-65aa82b05e99","Type":"ContainerStarted","Data":"bbef5c22bd4585574c1f8f176a77ae63d457a4f464f487be70606676a27eda78"} Nov 22 04:31:41 crc kubenswrapper[4699]: I1122 04:31:41.285531 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8","Type":"ContainerStarted","Data":"db7438c8cdbff0f233361bbe50fdd2e37eb3b768f922f021d3f0d28304adec7f"} Nov 22 04:31:41 crc kubenswrapper[4699]: I1122 04:31:41.459094 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d42bf1-de55-49eb-990f-451ad31d0e21" path="/var/lib/kubelet/pods/43d42bf1-de55-49eb-990f-451ad31d0e21/volumes" Nov 22 04:31:42 crc kubenswrapper[4699]: I1122 04:31:42.296764 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8","Type":"ContainerStarted","Data":"76c28962245e247522a218ca28306c7beb9085fa684551c13ac520655827cbb0"} Nov 22 04:31:43 crc kubenswrapper[4699]: I1122 04:31:43.309155 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"522fc300-2659-442f-9311-65aa82b05e99","Type":"ContainerStarted","Data":"40e31c5ddedfa3cfd1614adcc07e8d538bd9a5f8bdceb57432198385f3a0d25f"} Nov 22 04:32:08 crc kubenswrapper[4699]: I1122 04:32:08.727275 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:32:08 crc kubenswrapper[4699]: I1122 04:32:08.728518 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:32:08 crc kubenswrapper[4699]: I1122 04:32:08.728639 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:32:08 crc kubenswrapper[4699]: I1122 04:32:08.729424 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:32:08 crc kubenswrapper[4699]: I1122 04:32:08.729606 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722" gracePeriod=600 Nov 22 04:32:09 crc kubenswrapper[4699]: I1122 04:32:09.554585 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722" exitCode=0 Nov 22 04:32:09 crc kubenswrapper[4699]: I1122 04:32:09.554632 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" 
event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722"} Nov 22 04:32:09 crc kubenswrapper[4699]: I1122 04:32:09.555308 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581"} Nov 22 04:32:09 crc kubenswrapper[4699]: I1122 04:32:09.555331 4699 scope.go:117] "RemoveContainer" containerID="6069541dbe3b036cc4c74183802ec26cdc4e0a14a8ff9d64a37a60b66cc8ee5b" Nov 22 04:32:14 crc kubenswrapper[4699]: I1122 04:32:14.617671 4699 generic.go:334] "Generic (PLEG): container finished" podID="2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8" containerID="76c28962245e247522a218ca28306c7beb9085fa684551c13ac520655827cbb0" exitCode=0 Nov 22 04:32:14 crc kubenswrapper[4699]: I1122 04:32:14.617754 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8","Type":"ContainerDied","Data":"76c28962245e247522a218ca28306c7beb9085fa684551c13ac520655827cbb0"} Nov 22 04:32:15 crc kubenswrapper[4699]: I1122 04:32:15.628390 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8","Type":"ContainerStarted","Data":"24739a37cd78134e762ff32d652fc1e5bbec4fb306717dac903cc38837cb9816"} Nov 22 04:32:15 crc kubenswrapper[4699]: I1122 04:32:15.629244 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 04:32:15 crc kubenswrapper[4699]: I1122 04:32:15.631703 4699 generic.go:334] "Generic (PLEG): container finished" podID="522fc300-2659-442f-9311-65aa82b05e99" containerID="40e31c5ddedfa3cfd1614adcc07e8d538bd9a5f8bdceb57432198385f3a0d25f" exitCode=0 Nov 22 04:32:15 crc kubenswrapper[4699]: I1122 
04:32:15.631735 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"522fc300-2659-442f-9311-65aa82b05e99","Type":"ContainerDied","Data":"40e31c5ddedfa3cfd1614adcc07e8d538bd9a5f8bdceb57432198385f3a0d25f"} Nov 22 04:32:15 crc kubenswrapper[4699]: I1122 04:32:15.662034 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.662013949 podStartE2EDuration="36.662013949s" podCreationTimestamp="2025-11-22 04:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:32:15.653111743 +0000 UTC m=+1486.995732950" watchObservedRunningTime="2025-11-22 04:32:15.662013949 +0000 UTC m=+1487.004635146" Nov 22 04:32:16 crc kubenswrapper[4699]: I1122 04:32:16.642819 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"522fc300-2659-442f-9311-65aa82b05e99","Type":"ContainerStarted","Data":"281821eb5d0e6270042795382f17491c7ba136c02ef28513e0163be6a9da933f"} Nov 22 04:32:16 crc kubenswrapper[4699]: I1122 04:32:16.643446 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:32:16 crc kubenswrapper[4699]: I1122 04:32:16.666630 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.666609488 podStartE2EDuration="36.666609488s" podCreationTimestamp="2025-11-22 04:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:32:16.665787708 +0000 UTC m=+1488.008408915" watchObservedRunningTime="2025-11-22 04:32:16.666609488 +0000 UTC m=+1488.009230675" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.276067 4699 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.282340 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.293275 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.445335 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8rqc\" (UniqueName: \"kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.445464 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.445844 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.548789 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8rqc\" (UniqueName: \"kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc\") pod \"redhat-marketplace-m8wk4\" (UID: 
\"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.549068 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.549139 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.549625 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.549663 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.570018 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8rqc\" (UniqueName: \"kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc\") pod \"redhat-marketplace-m8wk4\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " 
pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:24 crc kubenswrapper[4699]: I1122 04:32:24.616518 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:25 crc kubenswrapper[4699]: I1122 04:32:25.097390 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:25 crc kubenswrapper[4699]: I1122 04:32:25.740381 4699 generic.go:334] "Generic (PLEG): container finished" podID="43900818-d99d-4ce4-aaba-3d26457edc41" containerID="604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc" exitCode=0 Nov 22 04:32:25 crc kubenswrapper[4699]: I1122 04:32:25.740421 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerDied","Data":"604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc"} Nov 22 04:32:25 crc kubenswrapper[4699]: I1122 04:32:25.741617 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerStarted","Data":"88b25c26f98a63ca0e92e54acd618b4dc239ce76504eb0cf114d9491e364aca7"} Nov 22 04:32:26 crc kubenswrapper[4699]: I1122 04:32:26.767198 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerStarted","Data":"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718"} Nov 22 04:32:27 crc kubenswrapper[4699]: I1122 04:32:27.777173 4699 generic.go:334] "Generic (PLEG): container finished" podID="43900818-d99d-4ce4-aaba-3d26457edc41" containerID="137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718" exitCode=0 Nov 22 04:32:27 crc kubenswrapper[4699]: I1122 04:32:27.777221 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerDied","Data":"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718"} Nov 22 04:32:28 crc kubenswrapper[4699]: I1122 04:32:28.791247 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerStarted","Data":"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a"} Nov 22 04:32:28 crc kubenswrapper[4699]: I1122 04:32:28.814927 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m8wk4" podStartSLOduration=2.270764334 podStartE2EDuration="4.814910065s" podCreationTimestamp="2025-11-22 04:32:24 +0000 UTC" firstStartedPulling="2025-11-22 04:32:25.743255633 +0000 UTC m=+1497.085876820" lastFinishedPulling="2025-11-22 04:32:28.287401364 +0000 UTC m=+1499.630022551" observedRunningTime="2025-11-22 04:32:28.811059712 +0000 UTC m=+1500.153680909" watchObservedRunningTime="2025-11-22 04:32:28.814910065 +0000 UTC m=+1500.157531252" Nov 22 04:32:29 crc kubenswrapper[4699]: I1122 04:32:29.979626 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 04:32:30 crc kubenswrapper[4699]: I1122 04:32:30.683011 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 04:32:31 crc kubenswrapper[4699]: I1122 04:32:31.942352 4699 scope.go:117] "RemoveContainer" containerID="e19911f188c7fc263ab27377da1b53e1a0be2f60dcb34322a9569882b57b1755" Nov 22 04:32:31 crc kubenswrapper[4699]: I1122 04:32:31.978202 4699 scope.go:117] "RemoveContainer" containerID="22cbc53f0e4539ab600ee81cc3fc691c24f4f65d650e4b1a822c0d12ea4098a1" Nov 22 04:32:32 crc kubenswrapper[4699]: I1122 04:32:32.012459 4699 scope.go:117] "RemoveContainer" 
containerID="a1a5b0166aeaff548a75355e9677f3747a6bae4fbd895a37d195f1a79b30ca91" Nov 22 04:32:32 crc kubenswrapper[4699]: I1122 04:32:32.032296 4699 scope.go:117] "RemoveContainer" containerID="c32e2d1bb5fc1bc7f8e4e31e454af69322bf5fa9a4fdd1bf53a07d6c8999d8ed" Nov 22 04:32:34 crc kubenswrapper[4699]: I1122 04:32:34.617724 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:34 crc kubenswrapper[4699]: I1122 04:32:34.618305 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:34 crc kubenswrapper[4699]: I1122 04:32:34.668295 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:34 crc kubenswrapper[4699]: I1122 04:32:34.915124 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:34 crc kubenswrapper[4699]: I1122 04:32:34.957796 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:36 crc kubenswrapper[4699]: I1122 04:32:36.887798 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m8wk4" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="registry-server" containerID="cri-o://508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a" gracePeriod=2 Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.384401 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.505074 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8rqc\" (UniqueName: \"kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc\") pod \"43900818-d99d-4ce4-aaba-3d26457edc41\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.505210 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content\") pod \"43900818-d99d-4ce4-aaba-3d26457edc41\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.505354 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities\") pod \"43900818-d99d-4ce4-aaba-3d26457edc41\" (UID: \"43900818-d99d-4ce4-aaba-3d26457edc41\") " Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.506476 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities" (OuterVolumeSpecName: "utilities") pod "43900818-d99d-4ce4-aaba-3d26457edc41" (UID: "43900818-d99d-4ce4-aaba-3d26457edc41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.511979 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc" (OuterVolumeSpecName: "kube-api-access-z8rqc") pod "43900818-d99d-4ce4-aaba-3d26457edc41" (UID: "43900818-d99d-4ce4-aaba-3d26457edc41"). InnerVolumeSpecName "kube-api-access-z8rqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.598071 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43900818-d99d-4ce4-aaba-3d26457edc41" (UID: "43900818-d99d-4ce4-aaba-3d26457edc41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.607713 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.607749 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8rqc\" (UniqueName: \"kubernetes.io/projected/43900818-d99d-4ce4-aaba-3d26457edc41-kube-api-access-z8rqc\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.607761 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43900818-d99d-4ce4-aaba-3d26457edc41-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.900176 4699 generic.go:334] "Generic (PLEG): container finished" podID="43900818-d99d-4ce4-aaba-3d26457edc41" containerID="508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a" exitCode=0 Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.900225 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m8wk4" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.900223 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerDied","Data":"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a"} Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.900331 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m8wk4" event={"ID":"43900818-d99d-4ce4-aaba-3d26457edc41","Type":"ContainerDied","Data":"88b25c26f98a63ca0e92e54acd618b4dc239ce76504eb0cf114d9491e364aca7"} Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.900357 4699 scope.go:117] "RemoveContainer" containerID="508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.932878 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.938594 4699 scope.go:117] "RemoveContainer" containerID="137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718" Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.941331 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m8wk4"] Nov 22 04:32:37 crc kubenswrapper[4699]: I1122 04:32:37.965820 4699 scope.go:117] "RemoveContainer" containerID="604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.006335 4699 scope.go:117] "RemoveContainer" containerID="508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a" Nov 22 04:32:38 crc kubenswrapper[4699]: E1122 04:32:38.006901 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a\": container with ID starting with 508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a not found: ID does not exist" containerID="508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.006937 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a"} err="failed to get container status \"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a\": rpc error: code = NotFound desc = could not find container \"508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a\": container with ID starting with 508f30ce32ad4de6911bfc619408bf27fbed318315bae06535204e23674c3f9a not found: ID does not exist" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.006962 4699 scope.go:117] "RemoveContainer" containerID="137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718" Nov 22 04:32:38 crc kubenswrapper[4699]: E1122 04:32:38.007230 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718\": container with ID starting with 137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718 not found: ID does not exist" containerID="137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.007254 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718"} err="failed to get container status \"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718\": rpc error: code = NotFound desc = could not find container \"137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718\": container with ID 
starting with 137c7a1c69b5c7c84dd52c921c39b681e9f120776eacad20c5f5975b5612a718 not found: ID does not exist" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.007272 4699 scope.go:117] "RemoveContainer" containerID="604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc" Nov 22 04:32:38 crc kubenswrapper[4699]: E1122 04:32:38.007606 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc\": container with ID starting with 604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc not found: ID does not exist" containerID="604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc" Nov 22 04:32:38 crc kubenswrapper[4699]: I1122 04:32:38.007632 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc"} err="failed to get container status \"604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc\": rpc error: code = NotFound desc = could not find container \"604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc\": container with ID starting with 604b185f80b880513bea14eb7445629cd91f8510ba2f139c84f32e7f73acc9dc not found: ID does not exist" Nov 22 04:32:39 crc kubenswrapper[4699]: I1122 04:32:39.464228 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" path="/var/lib/kubelet/pods/43900818-d99d-4ce4-aaba-3d26457edc41/volumes" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.931715 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:32:49 crc kubenswrapper[4699]: E1122 04:32:49.932725 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="registry-server" Nov 22 04:32:49 crc 
kubenswrapper[4699]: I1122 04:32:49.932741 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="registry-server" Nov 22 04:32:49 crc kubenswrapper[4699]: E1122 04:32:49.932763 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="extract-utilities" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.932770 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="extract-utilities" Nov 22 04:32:49 crc kubenswrapper[4699]: E1122 04:32:49.932779 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="extract-content" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.932785 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="extract-content" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.932976 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="43900818-d99d-4ce4-aaba-3d26457edc41" containerName="registry-server" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.934358 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.943634 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5pn\" (UniqueName: \"kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.943742 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.944218 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:49 crc kubenswrapper[4699]: I1122 04:32:49.949066 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.047038 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.047124 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hl5pn\" (UniqueName: \"kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.047217 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.047654 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.047654 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.069078 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5pn\" (UniqueName: \"kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn\") pod \"community-operators-cfwhd\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.279760 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:32:50 crc kubenswrapper[4699]: I1122 04:32:50.853085 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:32:51 crc kubenswrapper[4699]: I1122 04:32:51.056850 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerStarted","Data":"f2bf2578e365b4cb9291af69f486071521394bb0cb71781a716c59a7f8a1e58b"} Nov 22 04:32:52 crc kubenswrapper[4699]: I1122 04:32:52.066152 4699 generic.go:334] "Generic (PLEG): container finished" podID="8b836904-a497-47a0-b0ae-38728a8b01df" containerID="b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516" exitCode=0 Nov 22 04:32:52 crc kubenswrapper[4699]: I1122 04:32:52.066206 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerDied","Data":"b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516"} Nov 22 04:32:55 crc kubenswrapper[4699]: I1122 04:32:55.094559 4699 generic.go:334] "Generic (PLEG): container finished" podID="8b836904-a497-47a0-b0ae-38728a8b01df" containerID="0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636" exitCode=0 Nov 22 04:32:55 crc kubenswrapper[4699]: I1122 04:32:55.094646 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerDied","Data":"0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636"} Nov 22 04:32:58 crc kubenswrapper[4699]: I1122 04:32:58.135670 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" 
event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerStarted","Data":"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17"} Nov 22 04:32:58 crc kubenswrapper[4699]: I1122 04:32:58.163910 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfwhd" podStartSLOduration=3.6146512619999998 podStartE2EDuration="9.163878281s" podCreationTimestamp="2025-11-22 04:32:49 +0000 UTC" firstStartedPulling="2025-11-22 04:32:52.06981959 +0000 UTC m=+1523.412440777" lastFinishedPulling="2025-11-22 04:32:57.619046609 +0000 UTC m=+1528.961667796" observedRunningTime="2025-11-22 04:32:58.154609946 +0000 UTC m=+1529.497231143" watchObservedRunningTime="2025-11-22 04:32:58.163878281 +0000 UTC m=+1529.506499468" Nov 22 04:33:00 crc kubenswrapper[4699]: I1122 04:33:00.279914 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:00 crc kubenswrapper[4699]: I1122 04:33:00.280306 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:00 crc kubenswrapper[4699]: I1122 04:33:00.335930 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:10 crc kubenswrapper[4699]: I1122 04:33:10.332013 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:10 crc kubenswrapper[4699]: I1122 04:33:10.392885 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.267118 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cfwhd" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="registry-server" 
containerID="cri-o://0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17" gracePeriod=2 Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.774911 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.819212 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities\") pod \"8b836904-a497-47a0-b0ae-38728a8b01df\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.819275 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl5pn\" (UniqueName: \"kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn\") pod \"8b836904-a497-47a0-b0ae-38728a8b01df\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.819327 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content\") pod \"8b836904-a497-47a0-b0ae-38728a8b01df\" (UID: \"8b836904-a497-47a0-b0ae-38728a8b01df\") " Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.823488 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities" (OuterVolumeSpecName: "utilities") pod "8b836904-a497-47a0-b0ae-38728a8b01df" (UID: "8b836904-a497-47a0-b0ae-38728a8b01df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.828687 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn" (OuterVolumeSpecName: "kube-api-access-hl5pn") pod "8b836904-a497-47a0-b0ae-38728a8b01df" (UID: "8b836904-a497-47a0-b0ae-38728a8b01df"). InnerVolumeSpecName "kube-api-access-hl5pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.872757 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b836904-a497-47a0-b0ae-38728a8b01df" (UID: "8b836904-a497-47a0-b0ae-38728a8b01df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.921344 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.921378 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl5pn\" (UniqueName: \"kubernetes.io/projected/8b836904-a497-47a0-b0ae-38728a8b01df-kube-api-access-hl5pn\") on node \"crc\" DevicePath \"\"" Nov 22 04:33:11 crc kubenswrapper[4699]: I1122 04:33:11.921389 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b836904-a497-47a0-b0ae-38728a8b01df-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.280139 4699 generic.go:334] "Generic (PLEG): container finished" podID="8b836904-a497-47a0-b0ae-38728a8b01df" 
containerID="0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17" exitCode=0 Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.280194 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerDied","Data":"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17"} Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.280214 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfwhd" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.280240 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwhd" event={"ID":"8b836904-a497-47a0-b0ae-38728a8b01df","Type":"ContainerDied","Data":"f2bf2578e365b4cb9291af69f486071521394bb0cb71781a716c59a7f8a1e58b"} Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.280266 4699 scope.go:117] "RemoveContainer" containerID="0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.316934 4699 scope.go:117] "RemoveContainer" containerID="0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.324293 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.348106 4699 scope.go:117] "RemoveContainer" containerID="b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.359199 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cfwhd"] Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.405665 4699 scope.go:117] "RemoveContainer" containerID="0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17" Nov 22 
04:33:12 crc kubenswrapper[4699]: E1122 04:33:12.406233 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17\": container with ID starting with 0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17 not found: ID does not exist" containerID="0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.406269 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17"} err="failed to get container status \"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17\": rpc error: code = NotFound desc = could not find container \"0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17\": container with ID starting with 0f9f7604a8a10ed5613bf3b2872d82173b6347ec97ad8ee1f84eecad1ee2aa17 not found: ID does not exist" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.406296 4699 scope.go:117] "RemoveContainer" containerID="0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636" Nov 22 04:33:12 crc kubenswrapper[4699]: E1122 04:33:12.406516 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636\": container with ID starting with 0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636 not found: ID does not exist" containerID="0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.406545 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636"} err="failed to get container status 
\"0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636\": rpc error: code = NotFound desc = could not find container \"0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636\": container with ID starting with 0c3b0e2d05a95fc61ee44598d2668cd7e37770fd623119feb678462f3e433636 not found: ID does not exist" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.406562 4699 scope.go:117] "RemoveContainer" containerID="b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516" Nov 22 04:33:12 crc kubenswrapper[4699]: E1122 04:33:12.406781 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516\": container with ID starting with b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516 not found: ID does not exist" containerID="b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516" Nov 22 04:33:12 crc kubenswrapper[4699]: I1122 04:33:12.406806 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516"} err="failed to get container status \"b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516\": rpc error: code = NotFound desc = could not find container \"b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516\": container with ID starting with b3172cd87ab42f632135c931a813dbf478591a19ed8fa1daa14f4c495ee80516 not found: ID does not exist" Nov 22 04:33:13 crc kubenswrapper[4699]: I1122 04:33:13.476136 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" path="/var/lib/kubelet/pods/8b836904-a497-47a0-b0ae-38728a8b01df/volumes" Nov 22 04:33:32 crc kubenswrapper[4699]: I1122 04:33:32.149333 4699 scope.go:117] "RemoveContainer" containerID="1181a5b24efb2053c968a4aee3c0605356cb691d1a864b4dbc90d78b5c41283e" Nov 22 
04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.462074 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:33:47 crc kubenswrapper[4699]: E1122 04:33:47.463048 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="registry-server" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.463066 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="registry-server" Nov 22 04:33:47 crc kubenswrapper[4699]: E1122 04:33:47.463084 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="extract-utilities" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.463093 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="extract-utilities" Nov 22 04:33:47 crc kubenswrapper[4699]: E1122 04:33:47.463108 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="extract-content" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.463116 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="extract-content" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.463310 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b836904-a497-47a0-b0ae-38728a8b01df" containerName="registry-server" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.465073 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.474154 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.495792 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.495915 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkf5p\" (UniqueName: \"kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.496071 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.598721 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.599144 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vkf5p\" (UniqueName: \"kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.599169 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.599406 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.599724 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.626811 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkf5p\" (UniqueName: \"kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p\") pod \"certified-operators-hkrq8\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:47 crc kubenswrapper[4699]: I1122 04:33:47.791131 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:48 crc kubenswrapper[4699]: I1122 04:33:48.326926 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:33:48 crc kubenswrapper[4699]: W1122 04:33:48.348694 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d5cfa0f_e066_4919_901d_2bf01354e469.slice/crio-d8349636c71db98ad6b23a162cf81051748a3d73875742351ce53e874f7057c0 WatchSource:0}: Error finding container d8349636c71db98ad6b23a162cf81051748a3d73875742351ce53e874f7057c0: Status 404 returned error can't find the container with id d8349636c71db98ad6b23a162cf81051748a3d73875742351ce53e874f7057c0 Nov 22 04:33:48 crc kubenswrapper[4699]: I1122 04:33:48.651092 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerStarted","Data":"d8349636c71db98ad6b23a162cf81051748a3d73875742351ce53e874f7057c0"} Nov 22 04:33:49 crc kubenswrapper[4699]: I1122 04:33:49.668425 4699 generic.go:334] "Generic (PLEG): container finished" podID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerID="005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d" exitCode=0 Nov 22 04:33:49 crc kubenswrapper[4699]: I1122 04:33:49.668510 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerDied","Data":"005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d"} Nov 22 04:33:49 crc kubenswrapper[4699]: I1122 04:33:49.671226 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:33:51 crc kubenswrapper[4699]: I1122 04:33:51.690118 4699 generic.go:334] "Generic (PLEG): container finished" 
podID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerID="9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8" exitCode=0 Nov 22 04:33:51 crc kubenswrapper[4699]: I1122 04:33:51.690168 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerDied","Data":"9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8"} Nov 22 04:33:52 crc kubenswrapper[4699]: I1122 04:33:52.704929 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerStarted","Data":"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83"} Nov 22 04:33:52 crc kubenswrapper[4699]: I1122 04:33:52.738326 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hkrq8" podStartSLOduration=3.298392782 podStartE2EDuration="5.738305079s" podCreationTimestamp="2025-11-22 04:33:47 +0000 UTC" firstStartedPulling="2025-11-22 04:33:49.670931293 +0000 UTC m=+1581.013552480" lastFinishedPulling="2025-11-22 04:33:52.1108436 +0000 UTC m=+1583.453464777" observedRunningTime="2025-11-22 04:33:52.733313317 +0000 UTC m=+1584.075934514" watchObservedRunningTime="2025-11-22 04:33:52.738305079 +0000 UTC m=+1584.080926266" Nov 22 04:33:57 crc kubenswrapper[4699]: I1122 04:33:57.791990 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:57 crc kubenswrapper[4699]: I1122 04:33:57.792670 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:57 crc kubenswrapper[4699]: I1122 04:33:57.874980 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:58 
crc kubenswrapper[4699]: I1122 04:33:58.815749 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:33:58 crc kubenswrapper[4699]: I1122 04:33:58.867532 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:34:00 crc kubenswrapper[4699]: I1122 04:34:00.775026 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hkrq8" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="registry-server" containerID="cri-o://f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83" gracePeriod=2 Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.251126 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.298854 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkf5p\" (UniqueName: \"kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p\") pod \"6d5cfa0f-e066-4919-901d-2bf01354e469\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.316922 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p" (OuterVolumeSpecName: "kube-api-access-vkf5p") pod "6d5cfa0f-e066-4919-901d-2bf01354e469" (UID: "6d5cfa0f-e066-4919-901d-2bf01354e469"). InnerVolumeSpecName "kube-api-access-vkf5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.317680 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content\") pod \"6d5cfa0f-e066-4919-901d-2bf01354e469\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.318033 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities\") pod \"6d5cfa0f-e066-4919-901d-2bf01354e469\" (UID: \"6d5cfa0f-e066-4919-901d-2bf01354e469\") " Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.319315 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkf5p\" (UniqueName: \"kubernetes.io/projected/6d5cfa0f-e066-4919-901d-2bf01354e469-kube-api-access-vkf5p\") on node \"crc\" DevicePath \"\"" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.320777 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities" (OuterVolumeSpecName: "utilities") pod "6d5cfa0f-e066-4919-901d-2bf01354e469" (UID: "6d5cfa0f-e066-4919-901d-2bf01354e469"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.355692 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d5cfa0f-e066-4919-901d-2bf01354e469" (UID: "6d5cfa0f-e066-4919-901d-2bf01354e469"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.421696 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.421737 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d5cfa0f-e066-4919-901d-2bf01354e469-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.788682 4699 generic.go:334] "Generic (PLEG): container finished" podID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerID="f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83" exitCode=0 Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.788733 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerDied","Data":"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83"} Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.788769 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkrq8" event={"ID":"6d5cfa0f-e066-4919-901d-2bf01354e469","Type":"ContainerDied","Data":"d8349636c71db98ad6b23a162cf81051748a3d73875742351ce53e874f7057c0"} Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.788791 4699 scope.go:117] "RemoveContainer" containerID="f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.790285 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hkrq8" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.840201 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.852250 4699 scope.go:117] "RemoveContainer" containerID="9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.864015 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hkrq8"] Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.880367 4699 scope.go:117] "RemoveContainer" containerID="005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.944636 4699 scope.go:117] "RemoveContainer" containerID="f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83" Nov 22 04:34:01 crc kubenswrapper[4699]: E1122 04:34:01.945722 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83\": container with ID starting with f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83 not found: ID does not exist" containerID="f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.945865 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83"} err="failed to get container status \"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83\": rpc error: code = NotFound desc = could not find container \"f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83\": container with ID starting with f5ac47d83b3092cbc163c46a6efb53a67a74c3e14980ee66b2d0ede619096a83 not 
found: ID does not exist" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.945950 4699 scope.go:117] "RemoveContainer" containerID="9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8" Nov 22 04:34:01 crc kubenswrapper[4699]: E1122 04:34:01.946312 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8\": container with ID starting with 9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8 not found: ID does not exist" containerID="9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.946347 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8"} err="failed to get container status \"9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8\": rpc error: code = NotFound desc = could not find container \"9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8\": container with ID starting with 9a27565e5e2bb6fadd23a12f473d789fa7eb5a44710fdc74b58abf8f17418df8 not found: ID does not exist" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.946366 4699 scope.go:117] "RemoveContainer" containerID="005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d" Nov 22 04:34:01 crc kubenswrapper[4699]: E1122 04:34:01.946642 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d\": container with ID starting with 005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d not found: ID does not exist" containerID="005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d" Nov 22 04:34:01 crc kubenswrapper[4699]: I1122 04:34:01.946724 4699 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d"} err="failed to get container status \"005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d\": rpc error: code = NotFound desc = could not find container \"005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d\": container with ID starting with 005fdc0918e252cddca2619002e04bce50e4714243364d93b507d1f074e8d79d not found: ID does not exist" Nov 22 04:34:03 crc kubenswrapper[4699]: I1122 04:34:03.460520 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" path="/var/lib/kubelet/pods/6d5cfa0f-e066-4919-901d-2bf01354e469/volumes" Nov 22 04:34:32 crc kubenswrapper[4699]: I1122 04:34:32.262665 4699 scope.go:117] "RemoveContainer" containerID="7e1fcbb58973272b444fef351419c088db0c22f36b8210be107197b7f9bc8eaa" Nov 22 04:34:32 crc kubenswrapper[4699]: I1122 04:34:32.293673 4699 scope.go:117] "RemoveContainer" containerID="93774c3e979cfc6002da894382cdd76ae9a2684d25c31d0ab44f93df7f619464" Nov 22 04:34:38 crc kubenswrapper[4699]: I1122 04:34:38.726003 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:34:38 crc kubenswrapper[4699]: I1122 04:34:38.726803 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:35:08 crc kubenswrapper[4699]: I1122 04:35:08.726635 4699 patch_prober.go:28] interesting 
pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:35:08 crc kubenswrapper[4699]: I1122 04:35:08.727383 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.039939 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-p6fhv"] Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.053512 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1642-account-create-ht5w7"] Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.062852 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1642-account-create-ht5w7"] Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.074089 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-p6fhv"] Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.465268 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3fa899-c823-4cab-8224-1ca3130f515a" path="/var/lib/kubelet/pods/9e3fa899-c823-4cab-8224-1ca3130f515a/volumes" Nov 22 04:35:37 crc kubenswrapper[4699]: I1122 04:35:37.466536 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99675e1-93f6-4b73-b4cb-e8f096c3c16e" path="/var/lib/kubelet/pods/c99675e1-93f6-4b73-b4cb-e8f096c3c16e/volumes" Nov 22 04:35:38 crc kubenswrapper[4699]: I1122 04:35:38.726794 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:35:38 crc kubenswrapper[4699]: I1122 04:35:38.727263 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:35:38 crc kubenswrapper[4699]: I1122 04:35:38.727364 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:35:38 crc kubenswrapper[4699]: I1122 04:35:38.728181 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:35:38 crc kubenswrapper[4699]: I1122 04:35:38.728246 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" gracePeriod=600 Nov 22 04:35:38 crc kubenswrapper[4699]: E1122 04:35:38.852303 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:35:39 crc kubenswrapper[4699]: I1122 04:35:39.307493 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" exitCode=0 Nov 22 04:35:39 crc kubenswrapper[4699]: I1122 04:35:39.307556 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581"} Nov 22 04:35:39 crc kubenswrapper[4699]: I1122 04:35:39.307622 4699 scope.go:117] "RemoveContainer" containerID="f9b23a2370657a76cf1f4f279dceac7c7bb8c31dc2586215719f3f3336390722" Nov 22 04:35:39 crc kubenswrapper[4699]: I1122 04:35:39.308354 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:35:39 crc kubenswrapper[4699]: E1122 04:35:39.308647 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:35:40 crc kubenswrapper[4699]: I1122 04:35:40.024605 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ck2mh"] Nov 22 04:35:40 crc kubenswrapper[4699]: I1122 04:35:40.034550 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ck2mh"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.029324 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-81ad-account-create-jflt8"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.038851 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f2d9-account-create-9fnts"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.050568 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f2d9-account-create-9fnts"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.058392 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mnjhh"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.065795 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-81ad-account-create-jflt8"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.073047 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mnjhh"] Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.460172 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73de98eb-db4a-47f1-b23a-aa38b2db9078" path="/var/lib/kubelet/pods/73de98eb-db4a-47f1-b23a-aa38b2db9078/volumes" Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.461130 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8070964f-baae-4437-b0b7-2ff91608f0d7" path="/var/lib/kubelet/pods/8070964f-baae-4437-b0b7-2ff91608f0d7/volumes" Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.461811 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afe223c-55c0-40b3-aa14-ea52cad6bccc" path="/var/lib/kubelet/pods/8afe223c-55c0-40b3-aa14-ea52cad6bccc/volumes" Nov 22 04:35:41 crc kubenswrapper[4699]: I1122 04:35:41.462506 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7d25aa-4c77-48b6-88fe-11339dbca63a" path="/var/lib/kubelet/pods/fe7d25aa-4c77-48b6-88fe-11339dbca63a/volumes" Nov 22 04:35:54 crc kubenswrapper[4699]: I1122 04:35:54.447687 4699 scope.go:117] "RemoveContainer" 
containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:35:54 crc kubenswrapper[4699]: E1122 04:35:54.448509 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:36:05 crc kubenswrapper[4699]: I1122 04:36:05.448017 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:36:05 crc kubenswrapper[4699]: E1122 04:36:05.448850 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:36:08 crc kubenswrapper[4699]: I1122 04:36:08.036799 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rtkln"] Nov 22 04:36:08 crc kubenswrapper[4699]: I1122 04:36:08.045158 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rtkln"] Nov 22 04:36:09 crc kubenswrapper[4699]: I1122 04:36:09.474743 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feebe11e-01d4-44f9-a95d-9b35d3162cfd" path="/var/lib/kubelet/pods/feebe11e-01d4-44f9-a95d-9b35d3162cfd/volumes" Nov 22 04:36:20 crc kubenswrapper[4699]: I1122 04:36:20.447339 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 
04:36:20 crc kubenswrapper[4699]: E1122 04:36:20.448150 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.040499 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-919f-account-create-zkhx4"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.052872 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c223-account-create-gfsvr"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.062058 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jrsxw"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.070565 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-919f-account-create-zkhx4"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.078490 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jrsxw"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.085237 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6kgwt"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.091943 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c223-account-create-gfsvr"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.098509 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ac7-account-create-jmn9n"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.105887 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xj7gh"] Nov 22 04:36:21 crc 
kubenswrapper[4699]: I1122 04:36:21.114426 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xj7gh"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.123185 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5ac7-account-create-jmn9n"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.133481 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6kgwt"] Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.461963 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479c04e1-21fc-4674-98ac-3abc9ba96b34" path="/var/lib/kubelet/pods/479c04e1-21fc-4674-98ac-3abc9ba96b34/volumes" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.464326 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67126a3c-ec10-4f12-96ad-0133fcabb75f" path="/var/lib/kubelet/pods/67126a3c-ec10-4f12-96ad-0133fcabb75f/volumes" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.465388 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7817603d-cfdf-425d-84de-095ff5b5674e" path="/var/lib/kubelet/pods/7817603d-cfdf-425d-84de-095ff5b5674e/volumes" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.466565 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c287a2-30d6-4055-aea8-d104dbd472c2" path="/var/lib/kubelet/pods/96c287a2-30d6-4055-aea8-d104dbd472c2/volumes" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.467535 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb8d79c-fba7-489b-8953-f507deb03a03" path="/var/lib/kubelet/pods/ddb8d79c-fba7-489b-8953-f507deb03a03/volumes" Nov 22 04:36:21 crc kubenswrapper[4699]: I1122 04:36:21.468254 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece646f4-4f02-4a0d-9dde-ffb3d913410e" path="/var/lib/kubelet/pods/ece646f4-4f02-4a0d-9dde-ffb3d913410e/volumes" Nov 22 04:36:32 crc 
kubenswrapper[4699]: I1122 04:36:32.391010 4699 scope.go:117] "RemoveContainer" containerID="1f0da47f8055f67e2b11ad450c78bd219e11f2e4b3dd06a155e2031ab3f1d002" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.445847 4699 scope.go:117] "RemoveContainer" containerID="c34babc09fa972896b17116ea685f90d6a7b38dce92f5a6f8a3fd517e7d2fa98" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.448353 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:36:32 crc kubenswrapper[4699]: E1122 04:36:32.449116 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.479486 4699 scope.go:117] "RemoveContainer" containerID="25797f2bcddd3b7b79c299632af5532e867769ad77643c3195487f0fc0e5ffac" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.520501 4699 scope.go:117] "RemoveContainer" containerID="50684eda9651e67a2aa904aa2ecb7f8fd420565e07461db74e9efca2b0530472" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.583990 4699 scope.go:117] "RemoveContainer" containerID="315df20183e7724f4f44f825f46816d1c0546d85f93272ae82319dcc710d1f28" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.623117 4699 scope.go:117] "RemoveContainer" containerID="c9e75a351eef56340e30a3fd159bb197d4d0e4720077044dccb87ea5eabdb207" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.675808 4699 scope.go:117] "RemoveContainer" containerID="98c064a9afd4a919e21ad54c156ab613a1bd117554fc82d598efa73401984447" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.697382 4699 scope.go:117] 
"RemoveContainer" containerID="dffbde150b14f2f342df64f5c7e4cfd56ab88a5e00d59a77ba5240fb57ce1fa3" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.720730 4699 scope.go:117] "RemoveContainer" containerID="8b6ba740e45459f2bdbb9721045729d9604eddcf85137656d66d2083a581dbfb" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.743013 4699 scope.go:117] "RemoveContainer" containerID="8962410891dbd7d7cd624bd791f4233fad8ba12296a4b271e51cd53b7a12e51b" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.764216 4699 scope.go:117] "RemoveContainer" containerID="ef8020ddface36a37b2c7f54d500c076f405aba4a7d91d37d05a3debfe9dc74c" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.784239 4699 scope.go:117] "RemoveContainer" containerID="03cac1d004eeb36b5f78d4c545557cca1039054d9489fe9752da173743c5aaf6" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.811633 4699 scope.go:117] "RemoveContainer" containerID="0bb047593a6b282bfac14d88287ef8f339b80ee2b324fc6cd2220f5fcb8cf7da" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.855677 4699 scope.go:117] "RemoveContainer" containerID="7fe951a8339cdd4c0b84481d3d38e606bee6aae754cefa3921aea9b07129c6fc" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.884221 4699 scope.go:117] "RemoveContainer" containerID="4c2ddb86338033d36a854e6a597e4f32f3f958d00f97a4e665fe3ed6fb5b5942" Nov 22 04:36:32 crc kubenswrapper[4699]: I1122 04:36:32.932478 4699 scope.go:117] "RemoveContainer" containerID="d583b2f70707a4953393b4533486d274bb9897646119ce4c6029c87a6316f924" Nov 22 04:36:33 crc kubenswrapper[4699]: I1122 04:36:33.031406 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8xlt7"] Nov 22 04:36:33 crc kubenswrapper[4699]: I1122 04:36:33.044994 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8xlt7"] Nov 22 04:36:33 crc kubenswrapper[4699]: I1122 04:36:33.461087 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="34007e0a-511e-41f1-b3fc-810d7911d11c" path="/var/lib/kubelet/pods/34007e0a-511e-41f1-b3fc-810d7911d11c/volumes" Nov 22 04:36:44 crc kubenswrapper[4699]: I1122 04:36:44.027205 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-afdd-account-create-6x2nj"] Nov 22 04:36:44 crc kubenswrapper[4699]: I1122 04:36:44.039302 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-plg9d"] Nov 22 04:36:44 crc kubenswrapper[4699]: I1122 04:36:44.046816 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-afdd-account-create-6x2nj"] Nov 22 04:36:44 crc kubenswrapper[4699]: I1122 04:36:44.057274 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-plg9d"] Nov 22 04:36:45 crc kubenswrapper[4699]: I1122 04:36:45.464235 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb04579-e9a4-4139-8b94-4ce96f466397" path="/var/lib/kubelet/pods/0cb04579-e9a4-4139-8b94-4ce96f466397/volumes" Nov 22 04:36:45 crc kubenswrapper[4699]: I1122 04:36:45.465185 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4040846b-26da-4123-b778-0115b8c5e6da" path="/var/lib/kubelet/pods/4040846b-26da-4123-b778-0115b8c5e6da/volumes" Nov 22 04:36:47 crc kubenswrapper[4699]: I1122 04:36:47.449196 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:36:47 crc kubenswrapper[4699]: E1122 04:36:47.449775 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:36:58 crc kubenswrapper[4699]: I1122 
04:36:58.447777 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:36:58 crc kubenswrapper[4699]: E1122 04:36:58.448655 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:37:13 crc kubenswrapper[4699]: I1122 04:37:13.447656 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:37:13 crc kubenswrapper[4699]: E1122 04:37:13.448646 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:37:27 crc kubenswrapper[4699]: I1122 04:37:27.448336 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:37:27 crc kubenswrapper[4699]: E1122 04:37:27.449216 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:37:33 crc 
kubenswrapper[4699]: I1122 04:37:33.250639 4699 scope.go:117] "RemoveContainer" containerID="7857f769493c994617491a9656ac2379522e640053b0b678286edddd2d776b2e" Nov 22 04:37:33 crc kubenswrapper[4699]: I1122 04:37:33.289705 4699 scope.go:117] "RemoveContainer" containerID="805f75393a4c3c6f8b4816ed5752ac5eff8ef1a971274d36c33b43ff56076633" Nov 22 04:37:33 crc kubenswrapper[4699]: I1122 04:37:33.343150 4699 scope.go:117] "RemoveContainer" containerID="5af0e9b3d6eb36d09b86a7f1b03bd2813edaffcae4977d0f097f41890f77b98d" Nov 22 04:37:38 crc kubenswrapper[4699]: I1122 04:37:38.448203 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:37:38 crc kubenswrapper[4699]: E1122 04:37:38.448963 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:37:44 crc kubenswrapper[4699]: I1122 04:37:44.050419 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5bqjq"] Nov 22 04:37:44 crc kubenswrapper[4699]: I1122 04:37:44.064669 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5bqjq"] Nov 22 04:37:45 crc kubenswrapper[4699]: I1122 04:37:45.464286 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae96d89d-006a-4e7c-a42b-916cc7c77d19" path="/var/lib/kubelet/pods/ae96d89d-006a-4e7c-a42b-916cc7c77d19/volumes" Nov 22 04:37:50 crc kubenswrapper[4699]: I1122 04:37:50.448733 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:37:50 crc kubenswrapper[4699]: E1122 
04:37:50.449492 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:37:53 crc kubenswrapper[4699]: I1122 04:37:53.043925 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zb5vb"] Nov 22 04:37:53 crc kubenswrapper[4699]: I1122 04:37:53.053279 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zb5vb"] Nov 22 04:37:53 crc kubenswrapper[4699]: I1122 04:37:53.459667 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b" path="/var/lib/kubelet/pods/3a2587fb-ffcf-4c6c-9cfa-c97adc04aa1b/volumes" Nov 22 04:37:57 crc kubenswrapper[4699]: I1122 04:37:57.026863 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mf25h"] Nov 22 04:37:57 crc kubenswrapper[4699]: I1122 04:37:57.037013 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mf25h"] Nov 22 04:37:57 crc kubenswrapper[4699]: I1122 04:37:57.460362 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79a788b-1b1c-45df-9c90-3c30d382691b" path="/var/lib/kubelet/pods/a79a788b-1b1c-45df-9c90-3c30d382691b/volumes" Nov 22 04:38:01 crc kubenswrapper[4699]: I1122 04:38:01.448334 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:38:01 crc kubenswrapper[4699]: E1122 04:38:01.449207 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:38:05 crc kubenswrapper[4699]: I1122 04:38:05.044304 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fgx2c"] Nov 22 04:38:05 crc kubenswrapper[4699]: I1122 04:38:05.057601 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fgx2c"] Nov 22 04:38:05 crc kubenswrapper[4699]: I1122 04:38:05.458383 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2442edb-5370-4fd9-af87-6cb17498cee6" path="/var/lib/kubelet/pods/a2442edb-5370-4fd9-af87-6cb17498cee6/volumes" Nov 22 04:38:13 crc kubenswrapper[4699]: I1122 04:38:13.049639 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dhclj"] Nov 22 04:38:13 crc kubenswrapper[4699]: I1122 04:38:13.059099 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dhclj"] Nov 22 04:38:13 crc kubenswrapper[4699]: I1122 04:38:13.459803 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7883b3-956a-412b-87b7-f7366042440b" path="/var/lib/kubelet/pods/5c7883b3-956a-412b-87b7-f7366042440b/volumes" Nov 22 04:38:15 crc kubenswrapper[4699]: I1122 04:38:15.448568 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:38:15 crc kubenswrapper[4699]: E1122 04:38:15.450361 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.038738 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-b1b0-account-create-zfw5l"] Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.048922 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-c94bq"] Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.060548 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-b1b0-account-create-zfw5l"] Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.068932 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-c94bq"] Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.463725 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1b87a4-8a2e-4d69-b940-39df820a2a61" path="/var/lib/kubelet/pods/0e1b87a4-8a2e-4d69-b940-39df820a2a61/volumes" Nov 22 04:38:21 crc kubenswrapper[4699]: I1122 04:38:21.465420 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad04e92a-8275-4123-8b21-384b2f56cc3b" path="/var/lib/kubelet/pods/ad04e92a-8275-4123-8b21-384b2f56cc3b/volumes" Nov 22 04:38:27 crc kubenswrapper[4699]: I1122 04:38:27.447731 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:38:27 crc kubenswrapper[4699]: E1122 04:38:27.448380 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:38:28 crc 
kubenswrapper[4699]: I1122 04:38:28.153108 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6pxx/must-gather-6wzzd"] Nov 22 04:38:28 crc kubenswrapper[4699]: E1122 04:38:28.154153 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="extract-utilities" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.154181 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="extract-utilities" Nov 22 04:38:28 crc kubenswrapper[4699]: E1122 04:38:28.154205 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="registry-server" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.154215 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="registry-server" Nov 22 04:38:28 crc kubenswrapper[4699]: E1122 04:38:28.154232 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="extract-content" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.154243 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="extract-content" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.154540 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d5cfa0f-e066-4919-901d-2bf01354e469" containerName="registry-server" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.155975 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.158497 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6pxx"/"openshift-service-ca.crt" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.158697 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-n6pxx"/"default-dockercfg-sl6kd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.158704 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-n6pxx"/"kube-root-ca.crt" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.166225 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6pxx/must-gather-6wzzd"] Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.315658 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.315722 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n28hg\" (UniqueName: \"kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.418910 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " 
pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.418988 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n28hg\" (UniqueName: \"kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.419669 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.446177 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n28hg\" (UniqueName: \"kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg\") pod \"must-gather-6wzzd\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.481228 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.791029 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-n6pxx/must-gather-6wzzd"] Nov 22 04:38:28 crc kubenswrapper[4699]: I1122 04:38:28.955236 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" event={"ID":"79e77b1e-0b23-42b6-a491-d15ace6ebcac","Type":"ContainerStarted","Data":"5fd670517359afe2bdfbe0259c68877a7792832f16118bea18a77ec88b79e2c0"} Nov 22 04:38:33 crc kubenswrapper[4699]: I1122 04:38:33.451300 4699 scope.go:117] "RemoveContainer" containerID="96132b23ec10daaf82debb2041fe6a9acbd17fb95fd2543bf6497612064673cd" Nov 22 04:38:35 crc kubenswrapper[4699]: I1122 04:38:35.037299 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-405a-account-create-grbrm"] Nov 22 04:38:35 crc kubenswrapper[4699]: I1122 04:38:35.044273 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-405a-account-create-grbrm"] Nov 22 04:38:35 crc kubenswrapper[4699]: I1122 04:38:35.459091 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c24c357-369d-430b-a7ba-62783ed79d1f" path="/var/lib/kubelet/pods/4c24c357-369d-430b-a7ba-62783ed79d1f/volumes" Nov 22 04:38:35 crc kubenswrapper[4699]: I1122 04:38:35.893782 4699 scope.go:117] "RemoveContainer" containerID="cf76bc5ff52b00f4f1862b5dd35e5462be22707578a7621c50daad89a2ee162c" Nov 22 04:38:35 crc kubenswrapper[4699]: I1122 04:38:35.930069 4699 scope.go:117] "RemoveContainer" containerID="f89adf7b40fa021a0f196d11b77ffea1655dd93826c756566309dfba9ac95472" Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.003637 4699 scope.go:117] "RemoveContainer" containerID="b0a2c92d4114c57687fddfd1b595882f3246df62ba032fead209de790d82718d" Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.036230 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-db-create-qf7bw"] Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.054196 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5ae7-account-create-5gj27"] Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.065081 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qf7bw"] Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.074592 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5ae7-account-create-5gj27"] Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.081951 4699 scope.go:117] "RemoveContainer" containerID="ed5ad378ee9c82aaf3a68b092e4b798ba21ae9f666d69475f0a13e6d97a8c3c2" Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.143381 4699 scope.go:117] "RemoveContainer" containerID="d3d3fe9c4e2a1ce9c52ff6646ef81226fd80658671f070e8c6123e626e9a0221" Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.176016 4699 scope.go:117] "RemoveContainer" containerID="8d7efbd273318955f317598578a519fd4d002838c29a6f9db5a0fec055aba67f" Nov 22 04:38:36 crc kubenswrapper[4699]: I1122 04:38:36.200103 4699 scope.go:117] "RemoveContainer" containerID="1a0fba0263d2d23aed3014908bb57b46d4434eddf45ad1eb1da0cf9a68a9aa48" Nov 22 04:38:37 crc kubenswrapper[4699]: I1122 04:38:37.066156 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" event={"ID":"79e77b1e-0b23-42b6-a491-d15ace6ebcac","Type":"ContainerStarted","Data":"330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60"} Nov 22 04:38:37 crc kubenswrapper[4699]: I1122 04:38:37.066196 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" event={"ID":"79e77b1e-0b23-42b6-a491-d15ace6ebcac","Type":"ContainerStarted","Data":"3b0dc36a0accdba80432fb988e774c8b9be7f4fcb4eb2bd16123db8846a03c3a"} Nov 22 04:38:37 crc kubenswrapper[4699]: I1122 04:38:37.083387 4699 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" podStartSLOduration=1.955274848 podStartE2EDuration="9.083358036s" podCreationTimestamp="2025-11-22 04:38:28 +0000 UTC" firstStartedPulling="2025-11-22 04:38:28.80673209 +0000 UTC m=+1860.149353277" lastFinishedPulling="2025-11-22 04:38:35.934815268 +0000 UTC m=+1867.277436465" observedRunningTime="2025-11-22 04:38:37.081398728 +0000 UTC m=+1868.424019925" watchObservedRunningTime="2025-11-22 04:38:37.083358036 +0000 UTC m=+1868.425979223" Nov 22 04:38:37 crc kubenswrapper[4699]: I1122 04:38:37.463765 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aab7b5b-e294-4282-a3b3-75d47c1e911d" path="/var/lib/kubelet/pods/2aab7b5b-e294-4282-a3b3-75d47c1e911d/volumes" Nov 22 04:38:37 crc kubenswrapper[4699]: I1122 04:38:37.464321 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a913d6-76de-4f73-bc38-83471deabfdb" path="/var/lib/kubelet/pods/a0a913d6-76de-4f73-bc38-83471deabfdb/volumes" Nov 22 04:38:38 crc kubenswrapper[4699]: I1122 04:38:38.447920 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:38:38 crc kubenswrapper[4699]: E1122 04:38:38.448269 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.712079 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-m8g65"] Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.713664 4699 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.852038 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.852348 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wg2w\" (UniqueName: \"kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.954643 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.954694 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wg2w\" (UniqueName: \"kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.954770 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 
22 04:38:39 crc kubenswrapper[4699]: I1122 04:38:39.978979 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wg2w\" (UniqueName: \"kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w\") pod \"crc-debug-m8g65\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:40 crc kubenswrapper[4699]: I1122 04:38:40.031130 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:38:40 crc kubenswrapper[4699]: I1122 04:38:40.095473 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" event={"ID":"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9","Type":"ContainerStarted","Data":"bdcd5e6d5a603460d2dc9501cedb31a95bcc6969b7966b208cb8c6dc6b8fff20"} Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.024725 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vppfb"] Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.031678 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5945-account-create-kgfl7"] Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.041113 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mvxc4"] Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.050698 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vppfb"] Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.059574 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mvxc4"] Nov 22 04:38:48 crc kubenswrapper[4699]: I1122 04:38:48.066297 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5945-account-create-kgfl7"] Nov 22 04:38:49 crc kubenswrapper[4699]: I1122 04:38:49.472848 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="4d971429-cae7-4fed-9849-343ec7364f54" path="/var/lib/kubelet/pods/4d971429-cae7-4fed-9849-343ec7364f54/volumes" Nov 22 04:38:49 crc kubenswrapper[4699]: I1122 04:38:49.473670 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adab83b4-5c89-4ecd-af55-56492c7421b3" path="/var/lib/kubelet/pods/adab83b4-5c89-4ecd-af55-56492c7421b3/volumes" Nov 22 04:38:49 crc kubenswrapper[4699]: I1122 04:38:49.474228 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e987a5a1-15e5-43db-b896-d68d46cf841d" path="/var/lib/kubelet/pods/e987a5a1-15e5-43db-b896-d68d46cf841d/volumes" Nov 22 04:38:52 crc kubenswrapper[4699]: I1122 04:38:52.256576 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" event={"ID":"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9","Type":"ContainerStarted","Data":"3070b298889ac61c182a8dac62b5a28340c139baa0069242281ba8678c51c2b4"} Nov 22 04:38:52 crc kubenswrapper[4699]: I1122 04:38:52.275701 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" podStartSLOduration=1.739923196 podStartE2EDuration="13.275673478s" podCreationTimestamp="2025-11-22 04:38:39 +0000 UTC" firstStartedPulling="2025-11-22 04:38:40.072733176 +0000 UTC m=+1871.415354373" lastFinishedPulling="2025-11-22 04:38:51.608483468 +0000 UTC m=+1882.951104655" observedRunningTime="2025-11-22 04:38:52.267543379 +0000 UTC m=+1883.610164586" watchObservedRunningTime="2025-11-22 04:38:52.275673478 +0000 UTC m=+1883.618294685" Nov 22 04:38:53 crc kubenswrapper[4699]: I1122 04:38:53.448133 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:38:53 crc kubenswrapper[4699]: E1122 04:38:53.448832 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:05 crc kubenswrapper[4699]: I1122 04:39:05.467130 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:39:05 crc kubenswrapper[4699]: E1122 04:39:05.482826 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:20 crc kubenswrapper[4699]: I1122 04:39:20.447812 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:39:20 crc kubenswrapper[4699]: E1122 04:39:20.448524 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:33 crc kubenswrapper[4699]: I1122 04:39:33.453196 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:39:33 crc kubenswrapper[4699]: E1122 04:39:33.453945 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:34 crc kubenswrapper[4699]: I1122 04:39:34.735897 4699 generic.go:334] "Generic (PLEG): container finished" podID="73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" containerID="3070b298889ac61c182a8dac62b5a28340c139baa0069242281ba8678c51c2b4" exitCode=0 Nov 22 04:39:34 crc kubenswrapper[4699]: I1122 04:39:34.736029 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" event={"ID":"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9","Type":"ContainerDied","Data":"3070b298889ac61c182a8dac62b5a28340c139baa0069242281ba8678c51c2b4"} Nov 22 04:39:35 crc kubenswrapper[4699]: I1122 04:39:35.874875 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:39:35 crc kubenswrapper[4699]: I1122 04:39:35.908281 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-m8g65"] Nov 22 04:39:35 crc kubenswrapper[4699]: I1122 04:39:35.916223 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-m8g65"] Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.058286 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wg2w\" (UniqueName: \"kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w\") pod \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.058576 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host\") pod 
\"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\" (UID: \"73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9\") " Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.059034 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host" (OuterVolumeSpecName: "host") pod "73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" (UID: "73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.065935 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w" (OuterVolumeSpecName: "kube-api-access-6wg2w") pod "73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" (UID: "73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9"). InnerVolumeSpecName "kube-api-access-6wg2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.160333 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.160360 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wg2w\" (UniqueName: \"kubernetes.io/projected/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9-kube-api-access-6wg2w\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.402738 4699 scope.go:117] "RemoveContainer" containerID="220d7837f924ca7d5e47951feefec2cc78626f5372dfc8470215283969de5d6f" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.691704 4699 scope.go:117] "RemoveContainer" containerID="1dc9b48af9a286ac801387a62e24cb3cbd88475a440c0aa202e97d76cb30adc2" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.757880 4699 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bdcd5e6d5a603460d2dc9501cedb31a95bcc6969b7966b208cb8c6dc6b8fff20" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.757914 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-m8g65" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.809776 4699 scope.go:117] "RemoveContainer" containerID="fbb886ee9e6ed61f05098b0796808eef56e6ceb040d06eb51b58021dc4e58977" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.833950 4699 scope.go:117] "RemoveContainer" containerID="d283314d07f101264e5c7c630f6a848fddb263bfcd705b67c211e4d5457414ca" Nov 22 04:39:36 crc kubenswrapper[4699]: I1122 04:39:36.877265 4699 scope.go:117] "RemoveContainer" containerID="9d9ea24bfbc6519b1e377f935771807718439cd55362fcfd039944e4a8475657" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.049104 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wjmlk"] Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.057610 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wjmlk"] Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.098695 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-9rqk9"] Nov 22 04:39:37 crc kubenswrapper[4699]: E1122 04:39:37.099054 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" containerName="container-00" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.099071 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" containerName="container-00" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.099298 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" containerName="container-00" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.100796 4699 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.280820 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmg9d\" (UniqueName: \"kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.281023 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.383301 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.383478 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.383814 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmg9d\" (UniqueName: \"kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" 
Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.405256 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmg9d\" (UniqueName: \"kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d\") pod \"crc-debug-9rqk9\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.416706 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.464007 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2543fbe9-12e0-40d5-8474-dab6ed3144be" path="/var/lib/kubelet/pods/2543fbe9-12e0-40d5-8474-dab6ed3144be/volumes" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.464873 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9" path="/var/lib/kubelet/pods/73ea01c0-8a6a-450a-bdf4-e2d0a45e26c9/volumes" Nov 22 04:39:37 crc kubenswrapper[4699]: I1122 04:39:37.766491 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" event={"ID":"66b0119c-80a9-429b-a155-364c7a9a23c1","Type":"ContainerStarted","Data":"2bcd26cce80a6fae2ac9c8ea56e45116a58b55c94813c5f744c556aa574223ed"} Nov 22 04:39:38 crc kubenswrapper[4699]: I1122 04:39:38.779464 4699 generic.go:334] "Generic (PLEG): container finished" podID="66b0119c-80a9-429b-a155-364c7a9a23c1" containerID="315abea7cb4d9ba49ccd272d4d07989ffc946f8726e1f77aa575f880b06e3651" exitCode=0 Nov 22 04:39:38 crc kubenswrapper[4699]: I1122 04:39:38.779570 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" event={"ID":"66b0119c-80a9-429b-a155-364c7a9a23c1","Type":"ContainerDied","Data":"315abea7cb4d9ba49ccd272d4d07989ffc946f8726e1f77aa575f880b06e3651"} Nov 22 04:39:39 crc 
kubenswrapper[4699]: I1122 04:39:39.268327 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-9rqk9"] Nov 22 04:39:39 crc kubenswrapper[4699]: I1122 04:39:39.273684 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-9rqk9"] Nov 22 04:39:39 crc kubenswrapper[4699]: I1122 04:39:39.884729 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.031272 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmg9d\" (UniqueName: \"kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d\") pod \"66b0119c-80a9-429b-a155-364c7a9a23c1\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.031489 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host\") pod \"66b0119c-80a9-429b-a155-364c7a9a23c1\" (UID: \"66b0119c-80a9-429b-a155-364c7a9a23c1\") " Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.031598 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host" (OuterVolumeSpecName: "host") pod "66b0119c-80a9-429b-a155-364c7a9a23c1" (UID: "66b0119c-80a9-429b-a155-364c7a9a23c1"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.032154 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/66b0119c-80a9-429b-a155-364c7a9a23c1-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.037054 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d" (OuterVolumeSpecName: "kube-api-access-nmg9d") pod "66b0119c-80a9-429b-a155-364c7a9a23c1" (UID: "66b0119c-80a9-429b-a155-364c7a9a23c1"). InnerVolumeSpecName "kube-api-access-nmg9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.133644 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmg9d\" (UniqueName: \"kubernetes.io/projected/66b0119c-80a9-429b-a155-364c7a9a23c1-kube-api-access-nmg9d\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.330783 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:39:40 crc kubenswrapper[4699]: E1122 04:39:40.331798 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b0119c-80a9-429b-a155-364c7a9a23c1" containerName="container-00" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.331907 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b0119c-80a9-429b-a155-364c7a9a23c1" containerName="container-00" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.332277 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b0119c-80a9-429b-a155-364c7a9a23c1" containerName="container-00" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.334644 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.348208 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.439401 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn2rp\" (UniqueName: \"kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.440037 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.440147 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.515534 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-jq96n"] Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.517210 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.542139 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn2rp\" (UniqueName: \"kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.542177 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.542197 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.542805 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.543208 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " 
pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.561799 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn2rp\" (UniqueName: \"kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp\") pod \"redhat-operators-9p456\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.644042 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.644133 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tqgg\" (UniqueName: \"kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.653746 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.745721 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.745815 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tqgg\" (UniqueName: \"kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.746306 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.777314 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tqgg\" (UniqueName: \"kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg\") pod \"crc-debug-jq96n\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.806191 4699 scope.go:117] "RemoveContainer" containerID="315abea7cb4d9ba49ccd272d4d07989ffc946f8726e1f77aa575f880b06e3651" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.806288 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-9rqk9" Nov 22 04:39:40 crc kubenswrapper[4699]: I1122 04:39:40.839353 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.128342 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.458577 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b0119c-80a9-429b-a155-364c7a9a23c1" path="/var/lib/kubelet/pods/66b0119c-80a9-429b-a155-364c7a9a23c1/volumes" Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.824757 4699 generic.go:334] "Generic (PLEG): container finished" podID="6cbbb1af-c265-4fd4-81ba-51a83ae30668" containerID="4158ecb1f732a5c310b6dfdc4bb9e4752e2869c1f269e3e328cbeb2c6dc9862a" exitCode=0 Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.824904 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" event={"ID":"6cbbb1af-c265-4fd4-81ba-51a83ae30668","Type":"ContainerDied","Data":"4158ecb1f732a5c310b6dfdc4bb9e4752e2869c1f269e3e328cbeb2c6dc9862a"} Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.825271 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" event={"ID":"6cbbb1af-c265-4fd4-81ba-51a83ae30668","Type":"ContainerStarted","Data":"f060c42d7d3c66f3f67b9592056dad0f3d37848ab5cec7ee113c9ae04e0620cf"} Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.835782 4699 generic.go:334] "Generic (PLEG): container finished" podID="c7101057-977b-48ad-952c-beb8f89cba64" containerID="ab8173a70d6e998aa234289b8b950243dca2ddaad5a9421c949f0e430a351a9a" exitCode=0 Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.835881 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerDied","Data":"ab8173a70d6e998aa234289b8b950243dca2ddaad5a9421c949f0e430a351a9a"} Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.837257 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerStarted","Data":"4dd8166ded62e35114219b9e9caa48f43d4af9fcc2eff2d82c63d79b7beeeab8"} Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.845937 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.876616 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-jq96n"] Nov 22 04:39:41 crc kubenswrapper[4699]: I1122 04:39:41.883675 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6pxx/crc-debug-jq96n"] Nov 22 04:39:42 crc kubenswrapper[4699]: I1122 04:39:42.859131 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerStarted","Data":"d2afd20ef2108267a68c5dc16a414af5bbfdbec6b56bbae93a51eb60a2d51c02"} Nov 22 04:39:42 crc kubenswrapper[4699]: I1122 04:39:42.966242 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.094683 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host\") pod \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.094819 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host" (OuterVolumeSpecName: "host") pod "6cbbb1af-c265-4fd4-81ba-51a83ae30668" (UID: "6cbbb1af-c265-4fd4-81ba-51a83ae30668"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.094839 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tqgg\" (UniqueName: \"kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg\") pod \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\" (UID: \"6cbbb1af-c265-4fd4-81ba-51a83ae30668\") " Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.095841 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cbbb1af-c265-4fd4-81ba-51a83ae30668-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.100613 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg" (OuterVolumeSpecName: "kube-api-access-4tqgg") pod "6cbbb1af-c265-4fd4-81ba-51a83ae30668" (UID: "6cbbb1af-c265-4fd4-81ba-51a83ae30668"). InnerVolumeSpecName "kube-api-access-4tqgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.197772 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tqgg\" (UniqueName: \"kubernetes.io/projected/6cbbb1af-c265-4fd4-81ba-51a83ae30668-kube-api-access-4tqgg\") on node \"crc\" DevicePath \"\"" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.457647 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cbbb1af-c265-4fd4-81ba-51a83ae30668" path="/var/lib/kubelet/pods/6cbbb1af-c265-4fd4-81ba-51a83ae30668/volumes" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.870516 4699 scope.go:117] "RemoveContainer" containerID="4158ecb1f732a5c310b6dfdc4bb9e4752e2869c1f269e3e328cbeb2c6dc9862a" Nov 22 04:39:43 crc kubenswrapper[4699]: I1122 04:39:43.870526 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/crc-debug-jq96n" Nov 22 04:39:44 crc kubenswrapper[4699]: I1122 04:39:44.885033 4699 generic.go:334] "Generic (PLEG): container finished" podID="c7101057-977b-48ad-952c-beb8f89cba64" containerID="d2afd20ef2108267a68c5dc16a414af5bbfdbec6b56bbae93a51eb60a2d51c02" exitCode=0 Nov 22 04:39:44 crc kubenswrapper[4699]: I1122 04:39:44.885212 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerDied","Data":"d2afd20ef2108267a68c5dc16a414af5bbfdbec6b56bbae93a51eb60a2d51c02"} Nov 22 04:39:45 crc kubenswrapper[4699]: I1122 04:39:45.448687 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:39:45 crc kubenswrapper[4699]: E1122 04:39:45.448976 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:46 crc kubenswrapper[4699]: I1122 04:39:46.905670 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerStarted","Data":"fc10fbd1f995cb3d1dcbb5e4bfc737b17989375334081cd86b0d892d16f19cdc"} Nov 22 04:39:46 crc kubenswrapper[4699]: I1122 04:39:46.929991 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9p456" podStartSLOduration=2.733257312 podStartE2EDuration="6.929900227s" podCreationTimestamp="2025-11-22 04:39:40 +0000 UTC" firstStartedPulling="2025-11-22 04:39:41.845694525 +0000 UTC m=+1933.188315712" lastFinishedPulling="2025-11-22 04:39:46.04233744 +0000 UTC m=+1937.384958627" observedRunningTime="2025-11-22 04:39:46.927990661 +0000 UTC m=+1938.270611848" watchObservedRunningTime="2025-11-22 04:39:46.929900227 +0000 UTC m=+1938.272521414" Nov 22 04:39:50 crc kubenswrapper[4699]: I1122 04:39:50.654395 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:50 crc kubenswrapper[4699]: I1122 04:39:50.655993 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:39:51 crc kubenswrapper[4699]: I1122 04:39:51.703346 4699 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9p456" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="registry-server" probeResult="failure" output=< Nov 22 04:39:51 crc kubenswrapper[4699]: timeout: failed to connect service ":50051" within 1s Nov 22 04:39:51 crc kubenswrapper[4699]: > Nov 22 04:39:57 crc 
kubenswrapper[4699]: I1122 04:39:57.448128 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:39:57 crc kubenswrapper[4699]: E1122 04:39:57.448965 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:39:58 crc kubenswrapper[4699]: I1122 04:39:58.773960 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f47885746-l8msw_4c29f46a-251d-4422-a524-d5745603c348/barbican-api/0.log" Nov 22 04:39:58 crc kubenswrapper[4699]: I1122 04:39:58.900105 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f47885746-l8msw_4c29f46a-251d-4422-a524-d5745603c348/barbican-api-log/0.log" Nov 22 04:39:58 crc kubenswrapper[4699]: I1122 04:39:58.959751 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78b588d944-t7d25_da5bf8fa-2592-445a-acfc-56e044b4291c/barbican-keystone-listener/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.142195 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78b588d944-t7d25_da5bf8fa-2592-445a-acfc-56e044b4291c/barbican-keystone-listener-log/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.152797 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-85df448b85-c7qlg_6434b63e-cd0f-4cc2-aa3e-463cbf9e7800/barbican-worker/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.200583 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-85df448b85-c7qlg_6434b63e-cd0f-4cc2-aa3e-463cbf9e7800/barbican-worker-log/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.415916 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/proxy-httpd/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.419293 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/ceilometer-central-agent/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.422552 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/ceilometer-notification-agent/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.515440 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/sg-core/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.629596 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a79d9f7b-c6a8-44bc-a2c7-65467492cff2/cinder-api-log/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.731878 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a79d9f7b-c6a8-44bc-a2c7-65467492cff2/cinder-api/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.844076 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9b1d7c8-7353-480a-aa6f-7031b5228838/probe/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.883503 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9b1d7c8-7353-480a-aa6f-7031b5228838/cinder-scheduler/0.log" Nov 22 04:39:59 crc kubenswrapper[4699]: I1122 04:39:59.950561 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/init/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.039865 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-72t27"] Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.047848 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-72t27"] Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.198881 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/dnsmasq-dns/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.220118 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e4aa03f-40fe-45cb-8a03-445afd58f5b7/glance-httpd/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.233232 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/init/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.365033 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e4aa03f-40fe-45cb-8a03-445afd58f5b7/glance-log/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.417759 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0c750d0-0c65-4609-8ce0-5634ce490fc2/glance-httpd/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.593194 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0c750d0-0c65-4609-8ce0-5634ce490fc2/glance-log/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.716723 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 
04:40:00.761182 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/init/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.776520 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.875753 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/ironic-api-log/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.909373 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/ironic-api/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.926382 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/init/0.log" Nov 22 04:40:00 crc kubenswrapper[4699]: I1122 04:40:00.951049 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.077970 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.259083 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.276286 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.276362 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.476281 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88235903-ad93-439b-94a0-cd4afd05370f" path="/var/lib/kubelet/pods/88235903-ad93-439b-94a0-cd4afd05370f/volumes" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.494030 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.530217 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:40:01 crc kubenswrapper[4699]: I1122 04:40:01.935224 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.028485 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zvrvx"] Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.037768 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9p456" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="registry-server" containerID="cri-o://fc10fbd1f995cb3d1dcbb5e4bfc737b17989375334081cd86b0d892d16f19cdc" gracePeriod=2 Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.037806 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zvrvx"] Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.142547 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.434647 
4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/httpboot/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.597263 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.721284 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-conductor/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.800341 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ramdisk-logs/0.log" Nov 22 04:40:02 crc kubenswrapper[4699]: I1122 04:40:02.986731 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.049250 4699 generic.go:334] "Generic (PLEG): container finished" podID="c7101057-977b-48ad-952c-beb8f89cba64" containerID="fc10fbd1f995cb3d1dcbb5e4bfc737b17989375334081cd86b0d892d16f19cdc" exitCode=0 Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.049297 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerDied","Data":"fc10fbd1f995cb3d1dcbb5e4bfc737b17989375334081cd86b0d892d16f19cdc"} Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.067352 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.239330 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:40:03 
crc kubenswrapper[4699]: I1122 04:40:03.243496 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.246572 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.268612 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/ironic-db-sync/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.297396 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content\") pod \"c7101057-977b-48ad-952c-beb8f89cba64\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.297577 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn2rp\" (UniqueName: \"kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp\") pod \"c7101057-977b-48ad-952c-beb8f89cba64\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.297671 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities\") pod \"c7101057-977b-48ad-952c-beb8f89cba64\" (UID: \"c7101057-977b-48ad-952c-beb8f89cba64\") " Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.299259 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities" (OuterVolumeSpecName: "utilities") pod "c7101057-977b-48ad-952c-beb8f89cba64" (UID: 
"c7101057-977b-48ad-952c-beb8f89cba64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.305572 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp" (OuterVolumeSpecName: "kube-api-access-gn2rp") pod "c7101057-977b-48ad-952c-beb8f89cba64" (UID: "c7101057-977b-48ad-952c-beb8f89cba64"). InnerVolumeSpecName "kube-api-access-gn2rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.389749 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7101057-977b-48ad-952c-beb8f89cba64" (UID: "c7101057-977b-48ad-952c-beb8f89cba64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.401448 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.401527 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn2rp\" (UniqueName: \"kubernetes.io/projected/c7101057-977b-48ad-952c-beb8f89cba64-kube-api-access-gn2rp\") on node \"crc\" DevicePath \"\"" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.401545 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7101057-977b-48ad-952c-beb8f89cba64-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.458049 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.458211 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1" path="/var/lib/kubelet/pods/cfd327ba-6eeb-41a6-95f9-f2ad2385fcd1/volumes" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.556604 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.633098 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.656517 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.665649 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.856969 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.875129 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-httpboot/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.900014 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-inspector-httpd/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.915507 4699 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-inspector/0.log" Nov 22 04:40:03 crc kubenswrapper[4699]: I1122 04:40:03.919340 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.031851 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ramdisk-logs/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.059868 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9p456" event={"ID":"c7101057-977b-48ad-952c-beb8f89cba64","Type":"ContainerDied","Data":"4dd8166ded62e35114219b9e9caa48f43d4af9fcc2eff2d82c63d79b7beeeab8"} Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.059926 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9p456" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.059941 4699 scope.go:117] "RemoveContainer" containerID="fc10fbd1f995cb3d1dcbb5e4bfc737b17989375334081cd86b0d892d16f19cdc" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.076841 4699 scope.go:117] "RemoveContainer" containerID="d2afd20ef2108267a68c5dc16a414af5bbfdbec6b56bbae93a51eb60a2d51c02" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.087715 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-sync-kdn8x_57ead407-5bf6-4cc4-ac17-e939d329f220/ironic-inspector-db-sync/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.089021 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.101081 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-9p456"] Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.112256 4699 scope.go:117] "RemoveContainer" containerID="ab8173a70d6e998aa234289b8b950243dca2ddaad5a9421c949f0e430a351a9a" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.216819 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-65957c9c4f-4rj2b_474af2c7-c72f-4420-94a9-4876e0dbd68e/ironic-neutron-agent/2.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.291337 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-65957c9c4f-4rj2b_474af2c7-c72f-4420-94a9-4876e0dbd68e/ironic-neutron-agent/1.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.458747 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ef3b350d-96dc-4b7f-bc63-586d92e57da6/kube-state-metrics/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.509483 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6bf6559788-s4hk6_15aff0a7-6c4f-449c-addf-6cea805a4820/keystone-api/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.738918 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-566cbdbc45-ld9jb_4c5bbb47-8099-4bbb-b8a0-d2a56265522b/neutron-httpd/0.log" Nov 22 04:40:04 crc kubenswrapper[4699]: I1122 04:40:04.800795 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-566cbdbc45-ld9jb_4c5bbb47-8099-4bbb-b8a0-d2a56265522b/neutron-api/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.084095 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6c614af4-7edb-4d51-9b42-5826d1cf656b/nova-api-log/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.092755 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_6c614af4-7edb-4d51-9b42-5826d1cf656b/nova-api-api/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.381210 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-s776x_5273f2af-0355-484d-a907-589de1193a32/nova-manage/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.432980 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8624913e-b73b-41b8-ac5e-64d9114de859/nova-cell0-conductor-conductor/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.460049 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7101057-977b-48ad-952c-beb8f89cba64" path="/var/lib/kubelet/pods/c7101057-977b-48ad-952c-beb8f89cba64/volumes" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.712748 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_88bb930e-50f0-4126-8410-cc3dbb3e864b/nova-cell1-conductor-conductor/0.log" Nov 22 04:40:05 crc kubenswrapper[4699]: I1122 04:40:05.922650 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a36302e5-6f2a-4c2a-80db-9d02fea03316/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.144673 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da5d8c3b-bc84-4687-8f1d-c4763aba383c/nova-metadata-log/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.407301 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6bdcc9f1-da80-479b-b5d2-f4487ed993c7/nova-scheduler-scheduler/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.414369 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da5d8c3b-bc84-4687-8f1d-c4763aba383c/nova-metadata-metadata/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.542550 
4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/mysql-bootstrap/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.782313 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/mysql-bootstrap/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.782547 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/mysql-bootstrap/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.810945 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/galera/0.log" Nov 22 04:40:06 crc kubenswrapper[4699]: I1122 04:40:06.976113 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/mysql-bootstrap/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.098313 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b3f3d84b-ad88-4145-9e18-b2baa8eff9c4/openstackclient/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.100112 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/galera/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.314057 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server-init/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.476959 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-x9j67_14d98ff4-07de-4764-a1a6-238316e83ee3/openstack-network-exporter/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.811222 4699 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server-init/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.851359 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server/0.log" Nov 22 04:40:07 crc kubenswrapper[4699]: I1122 04:40:07.904469 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovs-vswitchd/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.032454 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7mlz_0311366c-c8c7-449c-b617-213a4d87de00/ovn-controller/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.173776 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6022714c-eabe-49a9-b794-0b7a0097b816/openstack-network-exporter/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.212661 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6022714c-eabe-49a9-b794-0b7a0097b816/ovn-northd/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.380310 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa/openstack-network-exporter/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.499014 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa/ovsdbserver-nb/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.571143 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3e31d684-0292-4e13-8bce-9af3fbcb09cb/openstack-network-exporter/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.676907 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_3e31d684-0292-4e13-8bce-9af3fbcb09cb/ovsdbserver-sb/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.754735 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64688bf4db-vwnwg_11aab908-3152-4d7b-bfb3-b4f3e04bb7a8/placement-api/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.860833 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64688bf4db-vwnwg_11aab908-3152-4d7b-bfb3-b4f3e04bb7a8/placement-log/0.log" Nov 22 04:40:08 crc kubenswrapper[4699]: I1122 04:40:08.963690 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/setup-container/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.311870 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/rabbitmq/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.327658 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/setup-container/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.375849 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/setup-container/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.572313 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/setup-container/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.604529 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/rabbitmq/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.643239 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-5997b85577-gkwmz_f3eaea68-e2a0-4b59-961e-eebded9815b1/proxy-httpd/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.768140 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5997b85577-gkwmz_f3eaea68-e2a0-4b59-961e-eebded9815b1/proxy-server/0.log" Nov 22 04:40:09 crc kubenswrapper[4699]: I1122 04:40:09.913541 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-t9rdp_ed96c0b0-7b76-4f03-b352-461405bbfb23/swift-ring-rebalance/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.141169 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-auditor/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.141836 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-reaper/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.208810 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-replicator/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.268361 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-server/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.364554 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-replicator/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.381996 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-auditor/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.447718 4699 scope.go:117] "RemoveContainer" 
containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:40:10 crc kubenswrapper[4699]: E1122 04:40:10.448062 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.449556 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-server/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.523733 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-updater/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.622764 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-auditor/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.667507 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-expirer/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.680034 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-replicator/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.814556 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-server/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.884677 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-updater/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.891723 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/rsync/0.log" Nov 22 04:40:10 crc kubenswrapper[4699]: I1122 04:40:10.967153 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/swift-recon-cron/0.log" Nov 22 04:40:11 crc kubenswrapper[4699]: I1122 04:40:11.704471 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_02e377d7-9e5a-45ec-9460-16af64ce3db5/memcached/0.log" Nov 22 04:40:25 crc kubenswrapper[4699]: I1122 04:40:25.448606 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:40:25 crc kubenswrapper[4699]: E1122 04:40:25.449467 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.208674 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.430997 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.473998 4699 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.501630 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.619152 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.673089 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.689691 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/extract/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.805377 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcvls_5cda144e-7465-4060-945a-89e3d288c551/kube-rbac-proxy/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.903986 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcvls_5cda144e-7465-4060-945a-89e3d288c551/manager/0.log" Nov 22 04:40:31 crc kubenswrapper[4699]: I1122 04:40:31.915566 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-pq7wj_455f990d-3a21-4c84-8a9d-e4a4af10c47f/kube-rbac-proxy/0.log" 
Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.025392 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-pq7wj_455f990d-3a21-4c84-8a9d-e4a4af10c47f/manager/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.130887 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-6fd98_b4a26451-a994-4295-b354-46babc06a258/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.134294 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-6fd98_b4a26451-a994-4295-b354-46babc06a258/manager/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.264915 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-thd4s_18c1e29a-63b8-4973-92a9-87c5b0301565/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.388374 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-thd4s_18c1e29a-63b8-4973-92a9-87c5b0301565/manager/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.411167 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-k9wzx_d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.463481 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-k9wzx_d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3/manager/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.570924 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-kd4w8_34a9f105-8024-4cc0-9ad2-14029731110d/manager/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.583966 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-kd4w8_34a9f105-8024-4cc0-9ad2-14029731110d/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.756228 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-q9xzz_c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.859401 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5d95d484b9-g8rz2_aaa73391-c097-4428-a43d-a5a4c1469419/kube-rbac-proxy/0.log" Nov 22 04:40:32 crc kubenswrapper[4699]: I1122 04:40:32.904403 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-q9xzz_c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.004331 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5d95d484b9-g8rz2_aaa73391-c097-4428-a43d-a5a4c1469419/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.069217 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-2dh9g_8ea3fa32-8451-4f8a-b395-98ce1382e116/kube-rbac-proxy/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.173840 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-2dh9g_8ea3fa32-8451-4f8a-b395-98ce1382e116/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.280762 
4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-qsszz_072faef9-c4a0-4bf9-84a8-fadca8945449/kube-rbac-proxy/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.462328 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-qsszz_072faef9-c4a0-4bf9-84a8-fadca8945449/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.583234 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-zd2r8_aca6ad44-aa04-4178-ab59-bfdec68e49e7/kube-rbac-proxy/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.638922 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-zd2r8_aca6ad44-aa04-4178-ab59-bfdec68e49e7/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.736928 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-v289m_91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930/kube-rbac-proxy/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.810951 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-v289m_91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930/manager/0.log" Nov 22 04:40:33 crc kubenswrapper[4699]: I1122 04:40:33.844118 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-qlcpr_e8f34ea0-681d-4a19-b9c9-0c230a7261e3/kube-rbac-proxy/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.004212 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-qlcpr_e8f34ea0-681d-4a19-b9c9-0c230a7261e3/manager/0.log" Nov 22 04:40:34 crc 
kubenswrapper[4699]: I1122 04:40:34.078711 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-7rck7_36905f20-0246-46f5-921a-2d18b2db8bdd/kube-rbac-proxy/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.123677 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-7rck7_36905f20-0246-46f5-921a-2d18b2db8bdd/manager/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.230192 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj_913c840a-25a6-46f9-bd06-e379438a5292/kube-rbac-proxy/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.273300 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj_913c840a-25a6-46f9-bd06-e379438a5292/manager/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.378924 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b5f67bb4-9h4ls_07e128b3-5973-437b-b7ec-80177dacf14f/kube-rbac-proxy/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.658826 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-655bc68c75-ttb9l_3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c/kube-rbac-proxy/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.739424 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-655bc68c75-ttb9l_3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c/operator/0.log" Nov 22 04:40:34 crc kubenswrapper[4699]: I1122 04:40:34.916018 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-wnnv6_0bb24428-cae6-49f4-b4d7-5a33488d5e2e/registry-server/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.021563 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-j5b4z_7199810d-9e13-4ed5-a4bc-46c874551678/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.187164 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b5f67bb4-9h4ls_07e128b3-5973-437b-b7ec-80177dacf14f/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.187300 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-fpcpg_0669449f-4e6b-4ab7-90ee-f8d93286db7a/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.315046 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-fpcpg_0669449f-4e6b-4ab7-90ee-f8d93286db7a/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.360733 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-j5b4z_7199810d-9e13-4ed5-a4bc-46c874551678/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.419207 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c_4fb724ba-7502-41eb-aab0-40eacbcd652e/operator/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.551359 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-fcjt6_a67d0761-3d62-4e25-80bc-cf6fac86cf0b/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.573598 4699 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-fcjt6_a67d0761-3d62-4e25-80bc-cf6fac86cf0b/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.612666 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-lxncl_0dc61afc-07cb-46af-afb8-4c0bf3bc84f0/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.773169 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-lxncl_0dc61afc-07cb-46af-afb8-4c0bf3bc84f0/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.803860 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-j5w8t_1cf7d81b-c0df-48d7-9b01-b7185a803ac6/manager/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.817618 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-j5w8t_1cf7d81b-c0df-48d7-9b01-b7185a803ac6/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.944761 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-t4rhb_482b57cb-741a-4062-9479-2a41febc67af/kube-rbac-proxy/0.log" Nov 22 04:40:35 crc kubenswrapper[4699]: I1122 04:40:35.946267 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-t4rhb_482b57cb-741a-4062-9479-2a41febc67af/manager/0.log" Nov 22 04:40:37 crc kubenswrapper[4699]: I1122 04:40:37.156705 4699 scope.go:117] "RemoveContainer" containerID="2c5de007d8fb94ac8a47a48b15acdd5cbfebbea9818405299e4042a67be3a215" Nov 22 04:40:37 crc kubenswrapper[4699]: I1122 04:40:37.210793 4699 scope.go:117] "RemoveContainer" 
containerID="4f8c36580a32469f7b1a2b1c6ff3454a6d29bd61b38ca805d1c2fc218f05457c" Nov 22 04:40:37 crc kubenswrapper[4699]: I1122 04:40:37.246601 4699 scope.go:117] "RemoveContainer" containerID="624b463938f550cc17b4e670673bf66a20b8b7cf86c27c5f9a60406c37b18761" Nov 22 04:40:37 crc kubenswrapper[4699]: I1122 04:40:37.449566 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:40:37 crc kubenswrapper[4699]: E1122 04:40:37.449956 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:40:45 crc kubenswrapper[4699]: I1122 04:40:45.038386 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-s776x"] Nov 22 04:40:45 crc kubenswrapper[4699]: I1122 04:40:45.045555 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-s776x"] Nov 22 04:40:45 crc kubenswrapper[4699]: I1122 04:40:45.460691 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5273f2af-0355-484d-a907-589de1193a32" path="/var/lib/kubelet/pods/5273f2af-0355-484d-a907-589de1193a32/volumes" Nov 22 04:40:51 crc kubenswrapper[4699]: I1122 04:40:51.131736 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w5jfl_7343df3b-7616-42dd-8e27-5f9a2031a8d9/control-plane-machine-set-operator/0.log" Nov 22 04:40:51 crc kubenswrapper[4699]: I1122 04:40:51.321743 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jlk5m_07ed42bd-25e2-43de-bbd7-431ab818b761/kube-rbac-proxy/0.log" Nov 22 04:40:51 crc kubenswrapper[4699]: I1122 04:40:51.365533 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jlk5m_07ed42bd-25e2-43de-bbd7-431ab818b761/machine-api-operator/0.log" Nov 22 04:40:52 crc kubenswrapper[4699]: I1122 04:40:52.448243 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:40:53 crc kubenswrapper[4699]: I1122 04:40:53.529981 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324"} Nov 22 04:41:02 crc kubenswrapper[4699]: I1122 04:41:02.644998 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-ggffg_2620c9cc-4041-49f0-bd0a-2b227e8214d6/cert-manager-controller/0.log" Nov 22 04:41:02 crc kubenswrapper[4699]: I1122 04:41:02.776206 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mcllz_69760a96-2f1a-4eca-8bc1-9734e255c260/cert-manager-cainjector/0.log" Nov 22 04:41:02 crc kubenswrapper[4699]: I1122 04:41:02.826684 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-x8jj9_bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b/cert-manager-webhook/0.log" Nov 22 04:41:14 crc kubenswrapper[4699]: I1122 04:41:14.686112 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-r7xcp_55b4d3a4-d0be-4184-b9ee-efedf1c27608/nmstate-console-plugin/0.log" Nov 22 04:41:14 crc kubenswrapper[4699]: I1122 04:41:14.921775 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-r67mk_0e1f9c73-89fc-4ab2-aca3-004315167c79/nmstate-metrics/0.log" Nov 22 04:41:14 crc kubenswrapper[4699]: I1122 04:41:14.925557 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-429gd_558d67eb-e4e4-46ca-bc65-8e4d568f4037/nmstate-handler/0.log" Nov 22 04:41:14 crc kubenswrapper[4699]: I1122 04:41:14.941406 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-r67mk_0e1f9c73-89fc-4ab2-aca3-004315167c79/kube-rbac-proxy/0.log" Nov 22 04:41:15 crc kubenswrapper[4699]: I1122 04:41:15.070850 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-6nj2r_a7e3ac11-e456-48a4-ad00-114a41462661/nmstate-operator/0.log" Nov 22 04:41:15 crc kubenswrapper[4699]: I1122 04:41:15.108189 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-7qwpr_0d107f7a-e965-41e9-8ceb-5c5ac1c3b530/nmstate-webhook/0.log" Nov 22 04:41:28 crc kubenswrapper[4699]: I1122 04:41:28.767881 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-g47xp_dae7dee7-2390-47bc-83c9-488f48a4cc90/kube-rbac-proxy/0.log" Nov 22 04:41:28 crc kubenswrapper[4699]: I1122 04:41:28.817566 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-g47xp_dae7dee7-2390-47bc-83c9-488f48a4cc90/controller/0.log" Nov 22 04:41:28 crc kubenswrapper[4699]: I1122 04:41:28.939372 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.038595 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: 
I1122 04:41:29.043600 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.093242 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.136006 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.326177 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.329385 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.337474 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.364733 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.508685 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.545545 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.551589 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.566349 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/controller/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.740636 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/kube-rbac-proxy-frr/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.745690 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/frr-metrics/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.765104 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/kube-rbac-proxy/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.963508 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-5tpp7_3975d03a-cd82-4ae3-89cb-fcad5f75330c/frr-k8s-webhook-server/0.log" Nov 22 04:41:29 crc kubenswrapper[4699]: I1122 04:41:29.966570 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/reloader/0.log" Nov 22 04:41:30 crc kubenswrapper[4699]: I1122 04:41:30.171754 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c65d8d687-6vpd9_f195708e-47e9-45a0-8361-7bbe6b6c6c0b/manager/0.log" Nov 22 04:41:30 crc kubenswrapper[4699]: I1122 04:41:30.374086 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-785654ff4c-blk7v_fbc128ff-e74b-44ee-a1a0-553a38bc79c7/webhook-server/0.log" Nov 22 04:41:30 crc kubenswrapper[4699]: I1122 04:41:30.479840 4699 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-psxdv_07114818-b4f9-465d-9745-c8a05af60e5a/kube-rbac-proxy/0.log" Nov 22 04:41:30 crc kubenswrapper[4699]: I1122 04:41:30.793183 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/frr/0.log" Nov 22 04:41:30 crc kubenswrapper[4699]: I1122 04:41:30.973819 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-psxdv_07114818-b4f9-465d-9745-c8a05af60e5a/speaker/0.log" Nov 22 04:41:37 crc kubenswrapper[4699]: I1122 04:41:37.410304 4699 scope.go:117] "RemoveContainer" containerID="9c3808a684aff428d27566e481a7cbb608f3db49ad4dc5f9d4586582693fc445" Nov 22 04:41:42 crc kubenswrapper[4699]: I1122 04:41:42.696710 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:41:42 crc kubenswrapper[4699]: I1122 04:41:42.829326 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:41:42 crc kubenswrapper[4699]: I1122 04:41:42.848189 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:41:42 crc kubenswrapper[4699]: I1122 04:41:42.848918 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.037444 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.046895 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.073190 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/extract/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.219691 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.377354 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.377474 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.405867 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.620166 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.641880 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.871664 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:41:43 crc kubenswrapper[4699]: I1122 04:41:43.879605 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/registry-server/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.034123 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.052167 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.069772 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.212482 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.228961 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.430946 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.578833 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.653924 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/registry-server/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.678069 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.678807 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.815806 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.850789 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/extract/0.log" Nov 22 04:41:44 crc kubenswrapper[4699]: I1122 04:41:44.869346 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:41:45 crc 
kubenswrapper[4699]: I1122 04:41:45.008163 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8pl4v_cfa86e4d-ee3e-4839-af4e-966184a73dc9/marketplace-operator/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.040177 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.217002 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.237357 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.243662 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.432606 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.444372 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.531502 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/registry-server/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.624683 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.829168 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.839734 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:41:45 crc kubenswrapper[4699]: I1122 04:41:45.850172 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:41:46 crc kubenswrapper[4699]: I1122 04:41:46.010568 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:41:46 crc kubenswrapper[4699]: I1122 04:41:46.041446 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:41:46 crc kubenswrapper[4699]: I1122 04:41:46.315536 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/registry-server/0.log" Nov 22 04:42:15 crc kubenswrapper[4699]: E1122 04:42:15.423771 4699 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.136:52692->38.102.83.136:39863: write tcp 38.102.83.136:52692->38.102.83.136:39863: write: connection reset by peer Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.273058 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:06 crc 
kubenswrapper[4699]: E1122 04:43:06.274135 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="registry-server" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.274150 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="registry-server" Nov 22 04:43:06 crc kubenswrapper[4699]: E1122 04:43:06.274171 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="extract-content" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.274177 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="extract-content" Nov 22 04:43:06 crc kubenswrapper[4699]: E1122 04:43:06.274214 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="extract-utilities" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.274221 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="extract-utilities" Nov 22 04:43:06 crc kubenswrapper[4699]: E1122 04:43:06.274238 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbbb1af-c265-4fd4-81ba-51a83ae30668" containerName="container-00" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.274243 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbbb1af-c265-4fd4-81ba-51a83ae30668" containerName="container-00" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.274416 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbbb1af-c265-4fd4-81ba-51a83ae30668" containerName="container-00" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.276179 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7101057-977b-48ad-952c-beb8f89cba64" containerName="registry-server" Nov 22 04:43:06 crc 
kubenswrapper[4699]: I1122 04:43:06.277495 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.304346 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.446004 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.446110 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.446170 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klcdc\" (UniqueName: \"kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.548188 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc 
kubenswrapper[4699]: I1122 04:43:06.548812 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.549668 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.550554 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.551270 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klcdc\" (UniqueName: \"kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 04:43:06.574480 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klcdc\" (UniqueName: \"kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc\") pod \"redhat-marketplace-f6sqt\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:06 crc kubenswrapper[4699]: I1122 
04:43:06.630013 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:07 crc kubenswrapper[4699]: I1122 04:43:07.959406 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.672508 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.679578 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.697233 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.726647 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.726719 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.778004 4699 generic.go:334] "Generic (PLEG): container finished" podID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerID="2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d" exitCode=0 Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.778046 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerDied","Data":"2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d"} Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.778069 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerStarted","Data":"6f2e71bc25876b40be1bc8ca1fdda4e695fe331114aa841de83875f0a38738cb"} Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.797230 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.797309 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.797367 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bp97\" (UniqueName: \"kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.899784 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.900246 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.900340 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bp97\" (UniqueName: \"kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.900514 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.900747 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:08 crc kubenswrapper[4699]: I1122 04:43:08.921842 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bp97\" (UniqueName: 
\"kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97\") pod \"community-operators-wnt4l\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:09 crc kubenswrapper[4699]: I1122 04:43:09.004268 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:09 crc kubenswrapper[4699]: W1122 04:43:09.576563 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649436e7_cf4f_4855_836a_a688d0f89f93.slice/crio-fdaaa0d8fa460b6d1ed8d1961a066e501cc871f898f75e68a06d4bfe180e6e16 WatchSource:0}: Error finding container fdaaa0d8fa460b6d1ed8d1961a066e501cc871f898f75e68a06d4bfe180e6e16: Status 404 returned error can't find the container with id fdaaa0d8fa460b6d1ed8d1961a066e501cc871f898f75e68a06d4bfe180e6e16 Nov 22 04:43:09 crc kubenswrapper[4699]: I1122 04:43:09.585285 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:09 crc kubenswrapper[4699]: I1122 04:43:09.787311 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerStarted","Data":"fdaaa0d8fa460b6d1ed8d1961a066e501cc871f898f75e68a06d4bfe180e6e16"} Nov 22 04:43:10 crc kubenswrapper[4699]: I1122 04:43:10.798407 4699 generic.go:334] "Generic (PLEG): container finished" podID="649436e7-cf4f-4855-836a-a688d0f89f93" containerID="fda6a9326bbb844da8e5af3cdc31f8c426ee6b5bd0eb09111283329c5069bb21" exitCode=0 Nov 22 04:43:10 crc kubenswrapper[4699]: I1122 04:43:10.798544 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" 
event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerDied","Data":"fda6a9326bbb844da8e5af3cdc31f8c426ee6b5bd0eb09111283329c5069bb21"} Nov 22 04:43:10 crc kubenswrapper[4699]: I1122 04:43:10.801859 4699 generic.go:334] "Generic (PLEG): container finished" podID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerID="1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532" exitCode=0 Nov 22 04:43:10 crc kubenswrapper[4699]: I1122 04:43:10.801921 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerDied","Data":"1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532"} Nov 22 04:43:13 crc kubenswrapper[4699]: I1122 04:43:13.851899 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerStarted","Data":"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7"} Nov 22 04:43:13 crc kubenswrapper[4699]: I1122 04:43:13.884315 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f6sqt" podStartSLOduration=4.154918851 podStartE2EDuration="7.884296914s" podCreationTimestamp="2025-11-22 04:43:06 +0000 UTC" firstStartedPulling="2025-11-22 04:43:08.780768108 +0000 UTC m=+2140.123389295" lastFinishedPulling="2025-11-22 04:43:12.510146161 +0000 UTC m=+2143.852767358" observedRunningTime="2025-11-22 04:43:13.87962451 +0000 UTC m=+2145.222245727" watchObservedRunningTime="2025-11-22 04:43:13.884296914 +0000 UTC m=+2145.226918101" Nov 22 04:43:14 crc kubenswrapper[4699]: I1122 04:43:14.865918 4699 generic.go:334] "Generic (PLEG): container finished" podID="649436e7-cf4f-4855-836a-a688d0f89f93" containerID="4462b31f8a6a1a71d201a431365459e345344ec54c1be02e296d8be92a777b9c" exitCode=0 Nov 22 04:43:14 crc kubenswrapper[4699]: I1122 04:43:14.866052 
4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerDied","Data":"4462b31f8a6a1a71d201a431365459e345344ec54c1be02e296d8be92a777b9c"} Nov 22 04:43:16 crc kubenswrapper[4699]: I1122 04:43:16.630845 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:16 crc kubenswrapper[4699]: I1122 04:43:16.631167 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:16 crc kubenswrapper[4699]: I1122 04:43:16.715088 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:16 crc kubenswrapper[4699]: I1122 04:43:16.897606 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerStarted","Data":"ecdfe48515508da3f557c6153f5f5c3477e2657c8db724ac77bfdef043a4c955"} Nov 22 04:43:16 crc kubenswrapper[4699]: I1122 04:43:16.921158 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wnt4l" podStartSLOduration=4.026901318 podStartE2EDuration="8.921142467s" podCreationTimestamp="2025-11-22 04:43:08 +0000 UTC" firstStartedPulling="2025-11-22 04:43:10.800979604 +0000 UTC m=+2142.143600791" lastFinishedPulling="2025-11-22 04:43:15.695220753 +0000 UTC m=+2147.037841940" observedRunningTime="2025-11-22 04:43:16.919940307 +0000 UTC m=+2148.262561504" watchObservedRunningTime="2025-11-22 04:43:16.921142467 +0000 UTC m=+2148.263763654" Nov 22 04:43:19 crc kubenswrapper[4699]: I1122 04:43:19.004422 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:19 crc 
kubenswrapper[4699]: I1122 04:43:19.005032 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:19 crc kubenswrapper[4699]: I1122 04:43:19.064997 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:21 crc kubenswrapper[4699]: I1122 04:43:21.940897 4699 generic.go:334] "Generic (PLEG): container finished" podID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerID="3b0dc36a0accdba80432fb988e774c8b9be7f4fcb4eb2bd16123db8846a03c3a" exitCode=0 Nov 22 04:43:21 crc kubenswrapper[4699]: I1122 04:43:21.941107 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" event={"ID":"79e77b1e-0b23-42b6-a491-d15ace6ebcac","Type":"ContainerDied","Data":"3b0dc36a0accdba80432fb988e774c8b9be7f4fcb4eb2bd16123db8846a03c3a"} Nov 22 04:43:21 crc kubenswrapper[4699]: I1122 04:43:21.941791 4699 scope.go:117] "RemoveContainer" containerID="3b0dc36a0accdba80432fb988e774c8b9be7f4fcb4eb2bd16123db8846a03c3a" Nov 22 04:43:22 crc kubenswrapper[4699]: I1122 04:43:22.504922 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6pxx_must-gather-6wzzd_79e77b1e-0b23-42b6-a491-d15ace6ebcac/gather/0.log" Nov 22 04:43:26 crc kubenswrapper[4699]: I1122 04:43:26.694996 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:26 crc kubenswrapper[4699]: I1122 04:43:26.791882 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.177925 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f6sqt" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="registry-server" 
containerID="cri-o://81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7" gracePeriod=2 Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.675017 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.691361 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content\") pod \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.691475 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klcdc\" (UniqueName: \"kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc\") pod \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.701278 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc" (OuterVolumeSpecName: "kube-api-access-klcdc") pod "72a71f7c-1bcf-4d02-98f6-ddef2b10672a" (UID: "72a71f7c-1bcf-4d02-98f6-ddef2b10672a"). InnerVolumeSpecName "kube-api-access-klcdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.732727 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72a71f7c-1bcf-4d02-98f6-ddef2b10672a" (UID: "72a71f7c-1bcf-4d02-98f6-ddef2b10672a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.793355 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities\") pod \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\" (UID: \"72a71f7c-1bcf-4d02-98f6-ddef2b10672a\") " Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.794292 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities" (OuterVolumeSpecName: "utilities") pod "72a71f7c-1bcf-4d02-98f6-ddef2b10672a" (UID: "72a71f7c-1bcf-4d02-98f6-ddef2b10672a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.795099 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klcdc\" (UniqueName: \"kubernetes.io/projected/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-kube-api-access-klcdc\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.795242 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:27 crc kubenswrapper[4699]: I1122 04:43:27.795336 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72a71f7c-1bcf-4d02-98f6-ddef2b10672a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.192690 4699 generic.go:334] "Generic (PLEG): container finished" podID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerID="81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7" exitCode=0 Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.192753 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerDied","Data":"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7"} Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.192803 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f6sqt" event={"ID":"72a71f7c-1bcf-4d02-98f6-ddef2b10672a","Type":"ContainerDied","Data":"6f2e71bc25876b40be1bc8ca1fdda4e695fe331114aa841de83875f0a38738cb"} Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.192801 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f6sqt" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.192825 4699 scope.go:117] "RemoveContainer" containerID="81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.222028 4699 scope.go:117] "RemoveContainer" containerID="1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.246356 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.254300 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f6sqt"] Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.280550 4699 scope.go:117] "RemoveContainer" containerID="2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.328820 4699 scope.go:117] "RemoveContainer" containerID="81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7" Nov 22 04:43:28 crc kubenswrapper[4699]: E1122 04:43:28.329319 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7\": container with ID starting with 81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7 not found: ID does not exist" containerID="81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.329361 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7"} err="failed to get container status \"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7\": rpc error: code = NotFound desc = could not find container \"81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7\": container with ID starting with 81f15bacf45f04a75bdfb184d2d62ff9466966f86a24143abcd2109f8eb746c7 not found: ID does not exist" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.329386 4699 scope.go:117] "RemoveContainer" containerID="1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532" Nov 22 04:43:28 crc kubenswrapper[4699]: E1122 04:43:28.329724 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532\": container with ID starting with 1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532 not found: ID does not exist" containerID="1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.329763 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532"} err="failed to get container status \"1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532\": rpc error: code = NotFound desc = could not find container \"1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532\": container with ID 
starting with 1963db3f99b736d271147ba05aa46938429f17d4bec29420864230a9431da532 not found: ID does not exist" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.329792 4699 scope.go:117] "RemoveContainer" containerID="2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d" Nov 22 04:43:28 crc kubenswrapper[4699]: E1122 04:43:28.330122 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d\": container with ID starting with 2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d not found: ID does not exist" containerID="2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d" Nov 22 04:43:28 crc kubenswrapper[4699]: I1122 04:43:28.330163 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d"} err="failed to get container status \"2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d\": rpc error: code = NotFound desc = could not find container \"2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d\": container with ID starting with 2fe4ace897b5f2d1a71767665a19290e733d5b4e837efd7973a5d62d919c252d not found: ID does not exist" Nov 22 04:43:29 crc kubenswrapper[4699]: I1122 04:43:29.067316 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:29 crc kubenswrapper[4699]: I1122 04:43:29.460102 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" path="/var/lib/kubelet/pods/72a71f7c-1bcf-4d02-98f6-ddef2b10672a/volumes" Nov 22 04:43:31 crc kubenswrapper[4699]: I1122 04:43:31.350473 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:31 crc 
kubenswrapper[4699]: I1122 04:43:31.351105 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wnt4l" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="registry-server" containerID="cri-o://ecdfe48515508da3f557c6153f5f5c3477e2657c8db724ac77bfdef043a4c955" gracePeriod=2 Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.237610 4699 generic.go:334] "Generic (PLEG): container finished" podID="649436e7-cf4f-4855-836a-a688d0f89f93" containerID="ecdfe48515508da3f557c6153f5f5c3477e2657c8db724ac77bfdef043a4c955" exitCode=0 Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.237754 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerDied","Data":"ecdfe48515508da3f557c6153f5f5c3477e2657c8db724ac77bfdef043a4c955"} Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.481277 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-n6pxx/must-gather-6wzzd"] Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.482138 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="copy" containerID="cri-o://330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60" gracePeriod=2 Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.493247 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-n6pxx/must-gather-6wzzd"] Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.593242 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.688456 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content\") pod \"649436e7-cf4f-4855-836a-a688d0f89f93\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.688646 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities\") pod \"649436e7-cf4f-4855-836a-a688d0f89f93\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.688695 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bp97\" (UniqueName: \"kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97\") pod \"649436e7-cf4f-4855-836a-a688d0f89f93\" (UID: \"649436e7-cf4f-4855-836a-a688d0f89f93\") " Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.689476 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities" (OuterVolumeSpecName: "utilities") pod "649436e7-cf4f-4855-836a-a688d0f89f93" (UID: "649436e7-cf4f-4855-836a-a688d0f89f93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.698347 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97" (OuterVolumeSpecName: "kube-api-access-4bp97") pod "649436e7-cf4f-4855-836a-a688d0f89f93" (UID: "649436e7-cf4f-4855-836a-a688d0f89f93"). InnerVolumeSpecName "kube-api-access-4bp97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:43:32 crc kubenswrapper[4699]: E1122 04:43:32.715392 4699 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e77b1e_0b23_42b6_a491_d15ace6ebcac.slice/crio-330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79e77b1e_0b23_42b6_a491_d15ace6ebcac.slice/crio-conmon-330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60.scope\": RecentStats: unable to find data in memory cache]" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.768106 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "649436e7-cf4f-4855-836a-a688d0f89f93" (UID: "649436e7-cf4f-4855-836a-a688d0f89f93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.790659 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.790993 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/649436e7-cf4f-4855-836a-a688d0f89f93-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:32 crc kubenswrapper[4699]: I1122 04:43:32.791005 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bp97\" (UniqueName: \"kubernetes.io/projected/649436e7-cf4f-4855-836a-a688d0f89f93-kube-api-access-4bp97\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.248290 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6pxx_must-gather-6wzzd_79e77b1e-0b23-42b6-a491-d15ace6ebcac/copy/0.log" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.248717 4699 generic.go:334] "Generic (PLEG): container finished" podID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerID="330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60" exitCode=143 Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.252054 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wnt4l" event={"ID":"649436e7-cf4f-4855-836a-a688d0f89f93","Type":"ContainerDied","Data":"fdaaa0d8fa460b6d1ed8d1961a066e501cc871f898f75e68a06d4bfe180e6e16"} Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.252117 4699 scope.go:117] "RemoveContainer" containerID="ecdfe48515508da3f557c6153f5f5c3477e2657c8db724ac77bfdef043a4c955" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.252169 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wnt4l" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.284148 4699 scope.go:117] "RemoveContainer" containerID="4462b31f8a6a1a71d201a431365459e345344ec54c1be02e296d8be92a777b9c" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.309726 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.323357 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wnt4l"] Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.355158 4699 scope.go:117] "RemoveContainer" containerID="fda6a9326bbb844da8e5af3cdc31f8c426ee6b5bd0eb09111283329c5069bb21" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.460324 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" path="/var/lib/kubelet/pods/649436e7-cf4f-4855-836a-a688d0f89f93/volumes" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.823358 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6pxx_must-gather-6wzzd_79e77b1e-0b23-42b6-a491-d15ace6ebcac/copy/0.log" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.824140 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.914381 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n28hg\" (UniqueName: \"kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg\") pod \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.914519 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output\") pod \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\" (UID: \"79e77b1e-0b23-42b6-a491-d15ace6ebcac\") " Nov 22 04:43:33 crc kubenswrapper[4699]: I1122 04:43:33.927582 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg" (OuterVolumeSpecName: "kube-api-access-n28hg") pod "79e77b1e-0b23-42b6-a491-d15ace6ebcac" (UID: "79e77b1e-0b23-42b6-a491-d15ace6ebcac"). InnerVolumeSpecName "kube-api-access-n28hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.021744 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n28hg\" (UniqueName: \"kubernetes.io/projected/79e77b1e-0b23-42b6-a491-d15ace6ebcac-kube-api-access-n28hg\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.061812 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "79e77b1e-0b23-42b6-a491-d15ace6ebcac" (UID: "79e77b1e-0b23-42b6-a491-d15ace6ebcac"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.124481 4699 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/79e77b1e-0b23-42b6-a491-d15ace6ebcac-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.264936 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-n6pxx_must-gather-6wzzd_79e77b1e-0b23-42b6-a491-d15ace6ebcac/copy/0.log" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.265558 4699 scope.go:117] "RemoveContainer" containerID="330bb1cd56cb9b03b1b4e88715b41bb9dcbd269e4a31a861b07f49d7f90dfb60" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.265665 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-n6pxx/must-gather-6wzzd" Nov 22 04:43:34 crc kubenswrapper[4699]: I1122 04:43:34.302795 4699 scope.go:117] "RemoveContainer" containerID="3b0dc36a0accdba80432fb988e774c8b9be7f4fcb4eb2bd16123db8846a03c3a" Nov 22 04:43:35 crc kubenswrapper[4699]: I1122 04:43:35.461462 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" path="/var/lib/kubelet/pods/79e77b1e-0b23-42b6-a491-d15ace6ebcac/volumes" Nov 22 04:43:38 crc kubenswrapper[4699]: I1122 04:43:38.725987 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:43:38 crc kubenswrapper[4699]: I1122 04:43:38.727651 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:44:08 crc kubenswrapper[4699]: I1122 04:44:08.726785 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:44:08 crc kubenswrapper[4699]: I1122 04:44:08.727500 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:44:08 crc kubenswrapper[4699]: I1122 04:44:08.727567 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:44:08 crc kubenswrapper[4699]: I1122 04:44:08.728785 4699 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:44:08 crc kubenswrapper[4699]: I1122 04:44:08.728886 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324" gracePeriod=600 Nov 22 04:44:09 crc kubenswrapper[4699]: I1122 04:44:09.605332 4699 generic.go:334] "Generic 
(PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324" exitCode=0 Nov 22 04:44:09 crc kubenswrapper[4699]: I1122 04:44:09.605374 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324"} Nov 22 04:44:09 crc kubenswrapper[4699]: I1122 04:44:09.606010 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b"} Nov 22 04:44:09 crc kubenswrapper[4699]: I1122 04:44:09.606088 4699 scope.go:117] "RemoveContainer" containerID="9f6c8a2daef4dc5617a6b47fc5d58598238dea049bba2ad09b65bd85f946e581" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.230918 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232584 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="gather" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232616 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="gather" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232635 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="registry-server" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232647 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="registry-server" Nov 22 04:44:44 crc 
kubenswrapper[4699]: E1122 04:44:44.232681 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="extract-content" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232693 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="extract-content" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232715 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="extract-utilities" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232726 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="extract-utilities" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232739 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="registry-server" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232750 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="registry-server" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232793 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="copy" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232805 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="copy" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 04:44:44.232827 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="extract-content" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232838 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="extract-content" Nov 22 04:44:44 crc kubenswrapper[4699]: E1122 
04:44:44.232869 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="extract-utilities" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.232881 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="extract-utilities" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.233237 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="649436e7-cf4f-4855-836a-a688d0f89f93" containerName="registry-server" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.233267 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="copy" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.233308 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a71f7c-1bcf-4d02-98f6-ddef2b10672a" containerName="registry-server" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.233328 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e77b1e-0b23-42b6-a491-d15ace6ebcac" containerName="gather" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.235974 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.247097 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.391097 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.391298 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pb66\" (UniqueName: \"kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.391352 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.492909 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pb66\" (UniqueName: \"kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.493018 4699 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.493055 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.493737 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.494344 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.527784 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pb66\" (UniqueName: \"kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66\") pod \"certified-operators-6xdwr\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:44 crc kubenswrapper[4699]: I1122 04:44:44.576480 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:45 crc kubenswrapper[4699]: I1122 04:44:45.105841 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:45 crc kubenswrapper[4699]: I1122 04:44:45.968516 4699 generic.go:334] "Generic (PLEG): container finished" podID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerID="449f1dac301f326daf8b131769cdceb2c62928f0ef98833369aa6d98736a161a" exitCode=0 Nov 22 04:44:45 crc kubenswrapper[4699]: I1122 04:44:45.968561 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerDied","Data":"449f1dac301f326daf8b131769cdceb2c62928f0ef98833369aa6d98736a161a"} Nov 22 04:44:45 crc kubenswrapper[4699]: I1122 04:44:45.968585 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerStarted","Data":"fc04dc36c808c493275c779ec75be9b56f5a7c4321504a183c3edc6ba2c1d2fc"} Nov 22 04:44:45 crc kubenswrapper[4699]: I1122 04:44:45.971675 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:44:47 crc kubenswrapper[4699]: I1122 04:44:47.995238 4699 generic.go:334] "Generic (PLEG): container finished" podID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerID="092a26dc54153ef541491c9650e817f940dba814ec759d25952d5f323c3d885a" exitCode=0 Nov 22 04:44:47 crc kubenswrapper[4699]: I1122 04:44:47.995761 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerDied","Data":"092a26dc54153ef541491c9650e817f940dba814ec759d25952d5f323c3d885a"} Nov 22 04:44:49 crc kubenswrapper[4699]: I1122 04:44:49.011788 4699 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerStarted","Data":"a29e5b6eb920ab281c89ef63d94ad0adbaaddffeccc77a20f117c3acd5fb0626"} Nov 22 04:44:49 crc kubenswrapper[4699]: I1122 04:44:49.036792 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6xdwr" podStartSLOduration=2.635106303 podStartE2EDuration="5.03677014s" podCreationTimestamp="2025-11-22 04:44:44 +0000 UTC" firstStartedPulling="2025-11-22 04:44:45.971430679 +0000 UTC m=+2237.314051856" lastFinishedPulling="2025-11-22 04:44:48.373094496 +0000 UTC m=+2239.715715693" observedRunningTime="2025-11-22 04:44:49.030312411 +0000 UTC m=+2240.372933618" watchObservedRunningTime="2025-11-22 04:44:49.03677014 +0000 UTC m=+2240.379391337" Nov 22 04:44:54 crc kubenswrapper[4699]: I1122 04:44:54.577580 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:54 crc kubenswrapper[4699]: I1122 04:44:54.578176 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:54 crc kubenswrapper[4699]: I1122 04:44:54.661723 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:55 crc kubenswrapper[4699]: I1122 04:44:55.142911 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:55 crc kubenswrapper[4699]: I1122 04:44:55.199151 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:57 crc kubenswrapper[4699]: I1122 04:44:57.084594 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6xdwr" 
podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="registry-server" containerID="cri-o://a29e5b6eb920ab281c89ef63d94ad0adbaaddffeccc77a20f117c3acd5fb0626" gracePeriod=2 Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.097737 4699 generic.go:334] "Generic (PLEG): container finished" podID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerID="a29e5b6eb920ab281c89ef63d94ad0adbaaddffeccc77a20f117c3acd5fb0626" exitCode=0 Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.098005 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerDied","Data":"a29e5b6eb920ab281c89ef63d94ad0adbaaddffeccc77a20f117c3acd5fb0626"} Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.098884 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6xdwr" event={"ID":"2ca12153-56c2-4ff7-af24-d20a1d85c90b","Type":"ContainerDied","Data":"fc04dc36c808c493275c779ec75be9b56f5a7c4321504a183c3edc6ba2c1d2fc"} Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.098911 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc04dc36c808c493275c779ec75be9b56f5a7c4321504a183c3edc6ba2c1d2fc" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.123046 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.281414 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities\") pod \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.281494 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pb66\" (UniqueName: \"kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66\") pod \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.281650 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content\") pod \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\" (UID: \"2ca12153-56c2-4ff7-af24-d20a1d85c90b\") " Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.283707 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities" (OuterVolumeSpecName: "utilities") pod "2ca12153-56c2-4ff7-af24-d20a1d85c90b" (UID: "2ca12153-56c2-4ff7-af24-d20a1d85c90b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.301691 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66" (OuterVolumeSpecName: "kube-api-access-2pb66") pod "2ca12153-56c2-4ff7-af24-d20a1d85c90b" (UID: "2ca12153-56c2-4ff7-af24-d20a1d85c90b"). InnerVolumeSpecName "kube-api-access-2pb66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.340946 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ca12153-56c2-4ff7-af24-d20a1d85c90b" (UID: "2ca12153-56c2-4ff7-af24-d20a1d85c90b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.383524 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.383551 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca12153-56c2-4ff7-af24-d20a1d85c90b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:44:58 crc kubenswrapper[4699]: I1122 04:44:58.383562 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pb66\" (UniqueName: \"kubernetes.io/projected/2ca12153-56c2-4ff7-af24-d20a1d85c90b-kube-api-access-2pb66\") on node \"crc\" DevicePath \"\"" Nov 22 04:44:59 crc kubenswrapper[4699]: I1122 04:44:59.110365 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6xdwr" Nov 22 04:44:59 crc kubenswrapper[4699]: I1122 04:44:59.161577 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:59 crc kubenswrapper[4699]: I1122 04:44:59.169997 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6xdwr"] Nov 22 04:44:59 crc kubenswrapper[4699]: I1122 04:44:59.472545 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" path="/var/lib/kubelet/pods/2ca12153-56c2-4ff7-af24-d20a1d85c90b/volumes" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.169639 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk"] Nov 22 04:45:00 crc kubenswrapper[4699]: E1122 04:45:00.170012 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="registry-server" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.170023 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="registry-server" Nov 22 04:45:00 crc kubenswrapper[4699]: E1122 04:45:00.170056 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="extract-utilities" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.170061 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="extract-utilities" Nov 22 04:45:00 crc kubenswrapper[4699]: E1122 04:45:00.170071 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="extract-content" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.170079 4699 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="extract-content" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.170253 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca12153-56c2-4ff7-af24-d20a1d85c90b" containerName="registry-server" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.170942 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.175222 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.182817 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk"] Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.183096 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.235746 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.235836 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc 
kubenswrapper[4699]: I1122 04:45:00.235985 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgg4\" (UniqueName: \"kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.337986 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.338041 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.338116 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgg4\" (UniqueName: \"kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.340171 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume\") pod \"collect-profiles-29396445-bgzkk\" 
(UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.353168 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.358545 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgg4\" (UniqueName: \"kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4\") pod \"collect-profiles-29396445-bgzkk\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.514277 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:00 crc kubenswrapper[4699]: I1122 04:45:00.968607 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk"] Nov 22 04:45:01 crc kubenswrapper[4699]: I1122 04:45:01.125959 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" event={"ID":"66f0d84d-72aa-4444-a316-3ac0daa874e9","Type":"ContainerStarted","Data":"2a5d9db5f29ac84e8822218481f0f66d688dd644a8444d5cc019ec75800396da"} Nov 22 04:45:02 crc kubenswrapper[4699]: I1122 04:45:02.135630 4699 generic.go:334] "Generic (PLEG): container finished" podID="66f0d84d-72aa-4444-a316-3ac0daa874e9" containerID="03fde75474da2889303f4bef2e6a98b6b83ccff6bd8e6eb4a94e033673d0d307" exitCode=0 Nov 22 04:45:02 crc kubenswrapper[4699]: I1122 04:45:02.135704 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" event={"ID":"66f0d84d-72aa-4444-a316-3ac0daa874e9","Type":"ContainerDied","Data":"03fde75474da2889303f4bef2e6a98b6b83ccff6bd8e6eb4a94e033673d0d307"} Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.521257 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.601265 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvgg4\" (UniqueName: \"kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4\") pod \"66f0d84d-72aa-4444-a316-3ac0daa874e9\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.601725 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume\") pod \"66f0d84d-72aa-4444-a316-3ac0daa874e9\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.601798 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume\") pod \"66f0d84d-72aa-4444-a316-3ac0daa874e9\" (UID: \"66f0d84d-72aa-4444-a316-3ac0daa874e9\") " Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.602459 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "66f0d84d-72aa-4444-a316-3ac0daa874e9" (UID: "66f0d84d-72aa-4444-a316-3ac0daa874e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.617993 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4" (OuterVolumeSpecName: "kube-api-access-fvgg4") pod "66f0d84d-72aa-4444-a316-3ac0daa874e9" (UID: "66f0d84d-72aa-4444-a316-3ac0daa874e9"). 
InnerVolumeSpecName "kube-api-access-fvgg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.618240 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "66f0d84d-72aa-4444-a316-3ac0daa874e9" (UID: "66f0d84d-72aa-4444-a316-3ac0daa874e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.703741 4699 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/66f0d84d-72aa-4444-a316-3ac0daa874e9-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.703770 4699 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/66f0d84d-72aa-4444-a316-3ac0daa874e9-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:03 crc kubenswrapper[4699]: I1122 04:45:03.703782 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvgg4\" (UniqueName: \"kubernetes.io/projected/66f0d84d-72aa-4444-a316-3ac0daa874e9-kube-api-access-fvgg4\") on node \"crc\" DevicePath \"\"" Nov 22 04:45:04 crc kubenswrapper[4699]: I1122 04:45:04.153835 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" event={"ID":"66f0d84d-72aa-4444-a316-3ac0daa874e9","Type":"ContainerDied","Data":"2a5d9db5f29ac84e8822218481f0f66d688dd644a8444d5cc019ec75800396da"} Nov 22 04:45:04 crc kubenswrapper[4699]: I1122 04:45:04.153881 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5d9db5f29ac84e8822218481f0f66d688dd644a8444d5cc019ec75800396da" Nov 22 04:45:04 crc kubenswrapper[4699]: I1122 04:45:04.153881 4699 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396445-bgzkk" Nov 22 04:45:04 crc kubenswrapper[4699]: I1122 04:45:04.593057 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"] Nov 22 04:45:04 crc kubenswrapper[4699]: I1122 04:45:04.599230 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396400-cqt2c"] Nov 22 04:45:05 crc kubenswrapper[4699]: I1122 04:45:05.474327 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8098591-7b9f-4330-90f0-4181570d05b3" path="/var/lib/kubelet/pods/a8098591-7b9f-4330-90f0-4181570d05b3/volumes" Nov 22 04:45:37 crc kubenswrapper[4699]: I1122 04:45:37.623370 4699 scope.go:117] "RemoveContainer" containerID="3070b298889ac61c182a8dac62b5a28340c139baa0069242281ba8678c51c2b4" Nov 22 04:45:37 crc kubenswrapper[4699]: I1122 04:45:37.652334 4699 scope.go:117] "RemoveContainer" containerID="96653f0a01f146e190b3a512f954078f8b6f50501d0d46839566d3eaa15e5b34" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.781546 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg9ms/must-gather-dt6xg"] Nov 22 04:46:06 crc kubenswrapper[4699]: E1122 04:46:06.782545 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66f0d84d-72aa-4444-a316-3ac0daa874e9" containerName="collect-profiles" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.782559 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="66f0d84d-72aa-4444-a316-3ac0daa874e9" containerName="collect-profiles" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.782763 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="66f0d84d-72aa-4444-a316-3ac0daa874e9" containerName="collect-profiles" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.783934 4699 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.787143 4699 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kg9ms"/"default-dockercfg-spvc5" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.787539 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kg9ms"/"kube-root-ca.crt" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.787768 4699 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kg9ms"/"openshift-service-ca.crt" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.804306 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kg9ms/must-gather-dt6xg"] Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.915238 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzt9z\" (UniqueName: \"kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:06 crc kubenswrapper[4699]: I1122 04:46:06.915423 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.017775 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzt9z\" (UniqueName: \"kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " 
pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.017846 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.018319 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.050396 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzt9z\" (UniqueName: \"kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z\") pod \"must-gather-dt6xg\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.119760 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.640060 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kg9ms/must-gather-dt6xg"] Nov 22 04:46:07 crc kubenswrapper[4699]: I1122 04:46:07.790524 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" event={"ID":"550ab800-e31d-4c0d-8bbc-6538ce378e8e","Type":"ContainerStarted","Data":"35318d4270bd02e3475c21dee836483585452c0a08591ce4b71070b36dbb1614"} Nov 22 04:46:08 crc kubenswrapper[4699]: I1122 04:46:08.811717 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" event={"ID":"550ab800-e31d-4c0d-8bbc-6538ce378e8e","Type":"ContainerStarted","Data":"45d1d8e94f72ffe58772f50e0a4f73691a142d310516f8644240a5ad77066ebf"} Nov 22 04:46:08 crc kubenswrapper[4699]: I1122 04:46:08.812312 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" event={"ID":"550ab800-e31d-4c0d-8bbc-6538ce378e8e","Type":"ContainerStarted","Data":"083cc5ad80d11505455451c3240f9aff38dda9a72398afd9e2452193c2c87e88"} Nov 22 04:46:08 crc kubenswrapper[4699]: I1122 04:46:08.838713 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" podStartSLOduration=2.838689301 podStartE2EDuration="2.838689301s" podCreationTimestamp="2025-11-22 04:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 04:46:08.825808755 +0000 UTC m=+2320.168429992" watchObservedRunningTime="2025-11-22 04:46:08.838689301 +0000 UTC m=+2320.181310498" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.438739 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-lmnk8"] Nov 22 04:46:11 crc kubenswrapper[4699]: 
I1122 04:46:11.439999 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.613750 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfnf\" (UniqueName: \"kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.614207 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.715561 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.715690 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.715713 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfnf\" (UniqueName: \"kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") 
" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.736459 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfnf\" (UniqueName: \"kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf\") pod \"crc-debug-lmnk8\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.761172 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:11 crc kubenswrapper[4699]: I1122 04:46:11.856015 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" event={"ID":"81e7617b-2f1d-4cfb-91af-b2ea933aa17b","Type":"ContainerStarted","Data":"c799ddb0b062115b22ef53484c9c4c1add516c26fc8a1edf5315f9d0494af8d4"} Nov 22 04:46:12 crc kubenswrapper[4699]: I1122 04:46:12.866944 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" event={"ID":"81e7617b-2f1d-4cfb-91af-b2ea933aa17b","Type":"ContainerStarted","Data":"4f167655a449b518fcd5349399944a03fac418b319b47bae385e032bde3ce6fe"} Nov 22 04:46:38 crc kubenswrapper[4699]: I1122 04:46:38.726394 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:46:38 crc kubenswrapper[4699]: I1122 04:46:38.727105 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Nov 22 04:46:43 crc kubenswrapper[4699]: I1122 04:46:43.107023 4699 generic.go:334] "Generic (PLEG): container finished" podID="81e7617b-2f1d-4cfb-91af-b2ea933aa17b" containerID="4f167655a449b518fcd5349399944a03fac418b319b47bae385e032bde3ce6fe" exitCode=0 Nov 22 04:46:43 crc kubenswrapper[4699]: I1122 04:46:43.107121 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" event={"ID":"81e7617b-2f1d-4cfb-91af-b2ea933aa17b","Type":"ContainerDied","Data":"4f167655a449b518fcd5349399944a03fac418b319b47bae385e032bde3ce6fe"} Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.227496 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.262931 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-lmnk8"] Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.272136 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-lmnk8"] Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.396581 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvfnf\" (UniqueName: \"kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf\") pod \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.396667 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host\") pod \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\" (UID: \"81e7617b-2f1d-4cfb-91af-b2ea933aa17b\") " Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.396719 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host" (OuterVolumeSpecName: "host") pod "81e7617b-2f1d-4cfb-91af-b2ea933aa17b" (UID: "81e7617b-2f1d-4cfb-91af-b2ea933aa17b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.397291 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.402107 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf" (OuterVolumeSpecName: "kube-api-access-qvfnf") pod "81e7617b-2f1d-4cfb-91af-b2ea933aa17b" (UID: "81e7617b-2f1d-4cfb-91af-b2ea933aa17b"). InnerVolumeSpecName "kube-api-access-qvfnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:46:44 crc kubenswrapper[4699]: I1122 04:46:44.499062 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvfnf\" (UniqueName: \"kubernetes.io/projected/81e7617b-2f1d-4cfb-91af-b2ea933aa17b-kube-api-access-qvfnf\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.128880 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c799ddb0b062115b22ef53484c9c4c1add516c26fc8a1edf5315f9d0494af8d4" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.128970 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-lmnk8" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.463806 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e7617b-2f1d-4cfb-91af-b2ea933aa17b" path="/var/lib/kubelet/pods/81e7617b-2f1d-4cfb-91af-b2ea933aa17b/volumes" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.516760 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-l5chq"] Nov 22 04:46:45 crc kubenswrapper[4699]: E1122 04:46:45.517552 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e7617b-2f1d-4cfb-91af-b2ea933aa17b" containerName="container-00" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.517569 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e7617b-2f1d-4cfb-91af-b2ea933aa17b" containerName="container-00" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.517771 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e7617b-2f1d-4cfb-91af-b2ea933aa17b" containerName="container-00" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.518403 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.620403 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s949\" (UniqueName: \"kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.620643 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.722783 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s949\" (UniqueName: \"kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.722872 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.722972 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc 
kubenswrapper[4699]: I1122 04:46:45.751403 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s949\" (UniqueName: \"kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949\") pod \"crc-debug-l5chq\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:45 crc kubenswrapper[4699]: I1122 04:46:45.841255 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:46 crc kubenswrapper[4699]: I1122 04:46:46.143121 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" event={"ID":"6f3be98b-17bb-46a6-9221-874ed5c69c75","Type":"ContainerStarted","Data":"b42fec4bb73e201237e59b82b362d25504e25bd68d48b71949ef248cbbb77ef6"} Nov 22 04:46:47 crc kubenswrapper[4699]: I1122 04:46:47.155729 4699 generic.go:334] "Generic (PLEG): container finished" podID="6f3be98b-17bb-46a6-9221-874ed5c69c75" containerID="2b3231061f3317da9b61b189ccfb20ba35f7c8e24db57446eadd05c67539892e" exitCode=0 Nov 22 04:46:47 crc kubenswrapper[4699]: I1122 04:46:47.155782 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" event={"ID":"6f3be98b-17bb-46a6-9221-874ed5c69c75","Type":"ContainerDied","Data":"2b3231061f3317da9b61b189ccfb20ba35f7c8e24db57446eadd05c67539892e"} Nov 22 04:46:47 crc kubenswrapper[4699]: I1122 04:46:47.750504 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-l5chq"] Nov 22 04:46:47 crc kubenswrapper[4699]: I1122 04:46:47.759270 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-l5chq"] Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.264347 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.382662 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s949\" (UniqueName: \"kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949\") pod \"6f3be98b-17bb-46a6-9221-874ed5c69c75\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.382866 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host\") pod \"6f3be98b-17bb-46a6-9221-874ed5c69c75\" (UID: \"6f3be98b-17bb-46a6-9221-874ed5c69c75\") " Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.383118 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host" (OuterVolumeSpecName: "host") pod "6f3be98b-17bb-46a6-9221-874ed5c69c75" (UID: "6f3be98b-17bb-46a6-9221-874ed5c69c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.383298 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f3be98b-17bb-46a6-9221-874ed5c69c75-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.390627 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949" (OuterVolumeSpecName: "kube-api-access-8s949") pod "6f3be98b-17bb-46a6-9221-874ed5c69c75" (UID: "6f3be98b-17bb-46a6-9221-874ed5c69c75"). InnerVolumeSpecName "kube-api-access-8s949". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.484662 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s949\" (UniqueName: \"kubernetes.io/projected/6f3be98b-17bb-46a6-9221-874ed5c69c75-kube-api-access-8s949\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.908177 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-dqtzd"] Nov 22 04:46:48 crc kubenswrapper[4699]: E1122 04:46:48.908646 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f3be98b-17bb-46a6-9221-874ed5c69c75" containerName="container-00" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.908666 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f3be98b-17bb-46a6-9221-874ed5c69c75" containerName="container-00" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.908848 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f3be98b-17bb-46a6-9221-874ed5c69c75" containerName="container-00" Nov 22 04:46:48 crc kubenswrapper[4699]: I1122 04:46:48.909458 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.094994 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6v88\" (UniqueName: \"kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.095332 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.172222 4699 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b42fec4bb73e201237e59b82b362d25504e25bd68d48b71949ef248cbbb77ef6" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.172288 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-l5chq" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.197325 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6v88\" (UniqueName: \"kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.197403 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.197487 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.221796 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6v88\" (UniqueName: \"kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88\") pod \"crc-debug-dqtzd\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.227715 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:49 crc kubenswrapper[4699]: I1122 04:46:49.464368 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3be98b-17bb-46a6-9221-874ed5c69c75" path="/var/lib/kubelet/pods/6f3be98b-17bb-46a6-9221-874ed5c69c75/volumes" Nov 22 04:46:50 crc kubenswrapper[4699]: I1122 04:46:50.189350 4699 generic.go:334] "Generic (PLEG): container finished" podID="fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" containerID="cab6dc165a1cec837d1baf8141c2c47eeec419361fdec5aff51c243ba63ba76c" exitCode=0 Nov 22 04:46:50 crc kubenswrapper[4699]: I1122 04:46:50.189483 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" event={"ID":"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0","Type":"ContainerDied","Data":"cab6dc165a1cec837d1baf8141c2c47eeec419361fdec5aff51c243ba63ba76c"} Nov 22 04:46:50 crc kubenswrapper[4699]: I1122 04:46:50.189934 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" event={"ID":"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0","Type":"ContainerStarted","Data":"41849bdae5ccc95d7856dd7aae5e5d7a49ed49394a446d8eba730e3879324d6f"} Nov 22 04:46:50 crc kubenswrapper[4699]: I1122 04:46:50.232188 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-dqtzd"] Nov 22 04:46:50 crc kubenswrapper[4699]: I1122 04:46:50.239489 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg9ms/crc-debug-dqtzd"] Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.288940 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.439145 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6v88\" (UniqueName: \"kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88\") pod \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.439208 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host\") pod \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\" (UID: \"fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0\") " Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.439381 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host" (OuterVolumeSpecName: "host") pod "fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" (UID: "fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.439762 4699 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-host\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.450876 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88" (OuterVolumeSpecName: "kube-api-access-q6v88") pod "fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" (UID: "fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0"). InnerVolumeSpecName "kube-api-access-q6v88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.467385 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" path="/var/lib/kubelet/pods/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0/volumes" Nov 22 04:46:51 crc kubenswrapper[4699]: I1122 04:46:51.543156 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6v88\" (UniqueName: \"kubernetes.io/projected/fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0-kube-api-access-q6v88\") on node \"crc\" DevicePath \"\"" Nov 22 04:46:52 crc kubenswrapper[4699]: I1122 04:46:52.210902 4699 scope.go:117] "RemoveContainer" containerID="cab6dc165a1cec837d1baf8141c2c47eeec419361fdec5aff51c243ba63ba76c" Nov 22 04:46:52 crc kubenswrapper[4699]: I1122 04:46:52.211029 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/crc-debug-dqtzd" Nov 22 04:47:08 crc kubenswrapper[4699]: I1122 04:47:08.726071 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:47:08 crc kubenswrapper[4699]: I1122 04:47:08.727777 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:47:11 crc kubenswrapper[4699]: I1122 04:47:11.487081 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f47885746-l8msw_4c29f46a-251d-4422-a524-d5745603c348/barbican-api/0.log" Nov 22 04:47:11 crc kubenswrapper[4699]: I1122 04:47:11.867768 
4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7f47885746-l8msw_4c29f46a-251d-4422-a524-d5745603c348/barbican-api-log/0.log" Nov 22 04:47:11 crc kubenswrapper[4699]: I1122 04:47:11.880939 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78b588d944-t7d25_da5bf8fa-2592-445a-acfc-56e044b4291c/barbican-keystone-listener/0.log" Nov 22 04:47:11 crc kubenswrapper[4699]: I1122 04:47:11.999307 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78b588d944-t7d25_da5bf8fa-2592-445a-acfc-56e044b4291c/barbican-keystone-listener-log/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.124710 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-85df448b85-c7qlg_6434b63e-cd0f-4cc2-aa3e-463cbf9e7800/barbican-worker/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.142054 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-85df448b85-c7qlg_6434b63e-cd0f-4cc2-aa3e-463cbf9e7800/barbican-worker-log/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.341602 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/ceilometer-central-agent/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.354330 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/proxy-httpd/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.397270 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/ceilometer-notification-agent/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.461065 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7858372b-0809-42b6-a01d-9db6f85d6c90/sg-core/0.log" Nov 22 
04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.568523 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a79d9f7b-c6a8-44bc-a2c7-65467492cff2/cinder-api-log/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.595844 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a79d9f7b-c6a8-44bc-a2c7-65467492cff2/cinder-api/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.796469 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9b1d7c8-7353-480a-aa6f-7031b5228838/probe/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.842988 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9b1d7c8-7353-480a-aa6f-7031b5228838/cinder-scheduler/0.log" Nov 22 04:47:12 crc kubenswrapper[4699]: I1122 04:47:12.939473 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/init/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.208510 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e4aa03f-40fe-45cb-8a03-445afd58f5b7/glance-httpd/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.224144 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/dnsmasq-dns/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.268183 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-q6kxn_c75eb722-836a-4b9f-ab34-1dc246154092/init/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.496890 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3e4aa03f-40fe-45cb-8a03-445afd58f5b7/glance-log/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.549141 4699 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0c750d0-0c65-4609-8ce0-5634ce490fc2/glance-log/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.555695 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c0c750d0-0c65-4609-8ce0-5634ce490fc2/glance-httpd/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.707177 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/init/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.873869 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/ironic-api-log/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.917061 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/init/0.log" Nov 22 04:47:13 crc kubenswrapper[4699]: I1122 04:47:13.936932 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-554db96b96-4xcnr_e01db47e-4633-40f5-ad23-14867d89eba8/ironic-api/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.086416 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.233600 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.259400 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.339494 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.625808 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:47:14 crc kubenswrapper[4699]: I1122 04:47:14.640350 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.133956 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-python-agent-init/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.179135 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/init/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.318867 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.380014 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/httpboot/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.564702 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ramdisk-logs/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.583860 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/ironic-conductor/0.log" Nov 22 04:47:15 crc kubenswrapper[4699]: I1122 04:47:15.806632 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.012169 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/ironic-db-sync/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.021204 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-bd6j2_19251598-5cdb-4e4f-9eb7-05cd21d988fb/init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.060708 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.093952 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.227939 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.434674 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.439017 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_6b0a42c8-e8a1-45b3-9f29-77459d98ea4d/pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.452980 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.467292 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.599340 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-httpboot/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.626824 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-python-agent-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.673631 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-inspector-httpd/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.675695 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ironic-inspector/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.704869 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/inspector-pxe-init/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.779152 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_3c8864ff-6365-44c6-8fe0-134c7f25b176/ramdisk-logs/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.863646 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-sync-kdn8x_57ead407-5bf6-4cc4-ac17-e939d329f220/ironic-inspector-db-sync/0.log" Nov 22 04:47:16 crc kubenswrapper[4699]: I1122 04:47:16.953119 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-65957c9c4f-4rj2b_474af2c7-c72f-4420-94a9-4876e0dbd68e/ironic-neutron-agent/2.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.029860 4699 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-65957c9c4f-4rj2b_474af2c7-c72f-4420-94a9-4876e0dbd68e/ironic-neutron-agent/1.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.189154 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_ef3b350d-96dc-4b7f-bc63-586d92e57da6/kube-state-metrics/0.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.240782 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6bf6559788-s4hk6_15aff0a7-6c4f-449c-addf-6cea805a4820/keystone-api/0.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.557507 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-566cbdbc45-ld9jb_4c5bbb47-8099-4bbb-b8a0-d2a56265522b/neutron-httpd/0.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.566601 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-566cbdbc45-ld9jb_4c5bbb47-8099-4bbb-b8a0-d2a56265522b/neutron-api/0.log" Nov 22 04:47:17 crc kubenswrapper[4699]: I1122 04:47:17.901266 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6c614af4-7edb-4d51-9b42-5826d1cf656b/nova-api-log/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.018413 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_8624913e-b73b-41b8-ac5e-64d9114de859/nova-cell0-conductor-conductor/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.033919 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6c614af4-7edb-4d51-9b42-5826d1cf656b/nova-api-api/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.224752 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_88bb930e-50f0-4126-8410-cc3dbb3e864b/nova-cell1-conductor-conductor/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.345404 4699 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a36302e5-6f2a-4c2a-80db-9d02fea03316/nova-cell1-novncproxy-novncproxy/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.494757 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da5d8c3b-bc84-4687-8f1d-c4763aba383c/nova-metadata-log/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.749201 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6bdcc9f1-da80-479b-b5d2-f4487ed993c7/nova-scheduler-scheduler/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.749667 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/mysql-bootstrap/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.922360 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/mysql-bootstrap/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.966633 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da5d8c3b-bc84-4687-8f1d-c4763aba383c/nova-metadata-metadata/0.log" Nov 22 04:47:18 crc kubenswrapper[4699]: I1122 04:47:18.974754 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e74585bc-d1cf-473d-95ca-12c816ff0020/galera/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.126344 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/mysql-bootstrap/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.306628 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b3f3d84b-ad88-4145-9e18-b2baa8eff9c4/openstackclient/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.326027 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/mysql-bootstrap/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.331249 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_57084326-d72e-40cb-9905-ca75d50f51e3/galera/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.562577 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-x9j67_14d98ff4-07de-4764-a1a6-238316e83ee3/openstack-network-exporter/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.581277 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server-init/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.756944 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server-init/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.776937 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovs-vswitchd/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.779441 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j7b96_55053527-f2d2-4e44-8a9c-153b74ef3605/ovsdb-server/0.log" Nov 22 04:47:19 crc kubenswrapper[4699]: I1122 04:47:19.987059 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7mlz_0311366c-c8c7-449c-b617-213a4d87de00/ovn-controller/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.000139 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6022714c-eabe-49a9-b794-0b7a0097b816/openstack-network-exporter/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.166772 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_6022714c-eabe-49a9-b794-0b7a0097b816/ovn-northd/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.205936 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa/ovsdbserver-nb/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.247747 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fb62cd5c-3d93-4b7d-810c-0ee46c6f90fa/openstack-network-exporter/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.387072 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3e31d684-0292-4e13-8bce-9af3fbcb09cb/openstack-network-exporter/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.470313 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3e31d684-0292-4e13-8bce-9af3fbcb09cb/ovsdbserver-sb/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.635583 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64688bf4db-vwnwg_11aab908-3152-4d7b-bfb3-b4f3e04bb7a8/placement-api/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.710985 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/setup-container/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.715934 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64688bf4db-vwnwg_11aab908-3152-4d7b-bfb3-b4f3e04bb7a8/placement-log/0.log" Nov 22 04:47:20 crc kubenswrapper[4699]: I1122 04:47:20.964904 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/setup-container/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.055966 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/setup-container/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.080048 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_522fc300-2659-442f-9311-65aa82b05e99/rabbitmq/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.216923 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/setup-container/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.279801 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e51fd04-9448-4f4c-a5f4-9e2cfb6d3de8/rabbitmq/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.443135 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5997b85577-gkwmz_f3eaea68-e2a0-4b59-961e-eebded9815b1/proxy-server/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.463776 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5997b85577-gkwmz_f3eaea68-e2a0-4b59-961e-eebded9815b1/proxy-httpd/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.607163 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-t9rdp_ed96c0b0-7b76-4f03-b352-461405bbfb23/swift-ring-rebalance/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.662450 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-auditor/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.757230 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-reaper/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.838200 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-replicator/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.881334 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/account-server/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.954901 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-replicator/0.log" Nov 22 04:47:21 crc kubenswrapper[4699]: I1122 04:47:21.982531 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-auditor/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.069492 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-server/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.075628 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/container-updater/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.156686 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-auditor/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.169110 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-expirer/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.236445 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-server/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.339697 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-replicator/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.354401 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/rsync/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.366450 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/object-updater/0.log" Nov 22 04:47:22 crc kubenswrapper[4699]: I1122 04:47:22.489902 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ff24634a-9171-4a5c-b045-4c653e032c18/swift-recon-cron/0.log" Nov 22 04:47:26 crc kubenswrapper[4699]: I1122 04:47:26.654678 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_02e377d7-9e5a-45ec-9460-16af64ce3db5/memcached/0.log" Nov 22 04:47:38 crc kubenswrapper[4699]: I1122 04:47:38.725907 4699 patch_prober.go:28] interesting pod/machine-config-daemon-kjwnt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 04:47:38 crc kubenswrapper[4699]: I1122 04:47:38.726678 4699 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 04:47:38 crc kubenswrapper[4699]: I1122 04:47:38.726722 4699 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" Nov 22 04:47:38 crc kubenswrapper[4699]: I1122 04:47:38.727417 4699 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b"} pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 04:47:38 crc kubenswrapper[4699]: I1122 04:47:38.727485 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerName="machine-config-daemon" containerID="cri-o://09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" gracePeriod=600 Nov 22 04:47:38 crc kubenswrapper[4699]: E1122 04:47:38.859826 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:47:39 crc kubenswrapper[4699]: I1122 04:47:39.624228 4699 generic.go:334] "Generic (PLEG): container finished" podID="41bdbae2-706a-4f84-9f56-5a42aec77762" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" exitCode=0 Nov 22 04:47:39 crc kubenswrapper[4699]: I1122 04:47:39.624297 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerDied","Data":"09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b"} Nov 22 04:47:39 crc kubenswrapper[4699]: I1122 04:47:39.624624 4699 scope.go:117] "RemoveContainer" containerID="3c295cef383260f09250dce73dcd78e9753722e5852bd1dba0e9d0043c5a2324" Nov 22 
04:47:39 crc kubenswrapper[4699]: I1122 04:47:39.625244 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:47:39 crc kubenswrapper[4699]: E1122 04:47:39.625517 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:47:43 crc kubenswrapper[4699]: I1122 04:47:43.377756 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:47:43 crc kubenswrapper[4699]: I1122 04:47:43.596294 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:47:43 crc kubenswrapper[4699]: I1122 04:47:43.612686 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:47:43 crc kubenswrapper[4699]: I1122 04:47:43.631347 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:47:43 crc kubenswrapper[4699]: I1122 04:47:43.913418 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/util/0.log" Nov 22 04:47:43 crc 
kubenswrapper[4699]: I1122 04:47:43.928719 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/pull/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.029214 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0702d9261541ca7ab33cc0f5bb569a6098e591a9e02db10dc12f9a2708fnb5w_d98e67ec-e732-4646-859f-5dcf61d03def/extract/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.086796 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcvls_5cda144e-7465-4060-945a-89e3d288c551/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.178609 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-qcvls_5cda144e-7465-4060-945a-89e3d288c551/manager/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.207889 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-pq7wj_455f990d-3a21-4c84-8a9d-e4a4af10c47f/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.323849 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-pq7wj_455f990d-3a21-4c84-8a9d-e4a4af10c47f/manager/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.420858 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-6fd98_b4a26451-a994-4295-b354-46babc06a258/manager/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.444012 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-6fd98_b4a26451-a994-4295-b354-46babc06a258/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.659907 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-thd4s_18c1e29a-63b8-4973-92a9-87c5b0301565/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.682182 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-k9wzx_d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.736880 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-thd4s_18c1e29a-63b8-4973-92a9-87c5b0301565/manager/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.916306 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-kd4w8_34a9f105-8024-4cc0-9ad2-14029731110d/kube-rbac-proxy/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.972359 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-k9wzx_d0b96c5e-2c37-4f90-b933-2b8f8cbdfaf3/manager/0.log" Nov 22 04:47:44 crc kubenswrapper[4699]: I1122 04:47:44.972594 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-kd4w8_34a9f105-8024-4cc0-9ad2-14029731110d/manager/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.298723 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-q9xzz_c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477/kube-rbac-proxy/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 
04:47:45.395306 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5d95d484b9-g8rz2_aaa73391-c097-4428-a43d-a5a4c1469419/kube-rbac-proxy/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.469485 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7875d8bb94-q9xzz_c6fc85a2-ca9a-4d87-811a-4ff8fcdcf477/manager/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.543645 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5d95d484b9-g8rz2_aaa73391-c097-4428-a43d-a5a4c1469419/manager/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.596637 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-2dh9g_8ea3fa32-8451-4f8a-b395-98ce1382e116/kube-rbac-proxy/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.766399 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-qsszz_072faef9-c4a0-4bf9-84a8-fadca8945449/kube-rbac-proxy/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.790545 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-2dh9g_8ea3fa32-8451-4f8a-b395-98ce1382e116/manager/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.805564 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-qsszz_072faef9-c4a0-4bf9-84a8-fadca8945449/manager/0.log" Nov 22 04:47:45 crc kubenswrapper[4699]: I1122 04:47:45.979056 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-zd2r8_aca6ad44-aa04-4178-ab59-bfdec68e49e7/kube-rbac-proxy/0.log" 
Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.057284 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-zd2r8_aca6ad44-aa04-4178-ab59-bfdec68e49e7/manager/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.188343 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-v289m_91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930/kube-rbac-proxy/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.227118 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-v289m_91f34dc0-cd1e-4f25-90e8-cbcb4ae2d930/manager/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.273152 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-qlcpr_e8f34ea0-681d-4a19-b9c9-0c230a7261e3/kube-rbac-proxy/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.474528 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-qlcpr_e8f34ea0-681d-4a19-b9c9-0c230a7261e3/manager/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.521438 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-7rck7_36905f20-0246-46f5-921a-2d18b2db8bdd/manager/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.636875 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-7rck7_36905f20-0246-46f5-921a-2d18b2db8bdd/kube-rbac-proxy/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.751766 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj_913c840a-25a6-46f9-bd06-e379438a5292/manager/0.log" Nov 22 04:47:46 crc kubenswrapper[4699]: I1122 04:47:46.780387 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-qcxmj_913c840a-25a6-46f9-bd06-e379438a5292/kube-rbac-proxy/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.058016 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-655bc68c75-ttb9l_3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c/kube-rbac-proxy/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.065960 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b5f67bb4-9h4ls_07e128b3-5973-437b-b7ec-80177dacf14f/kube-rbac-proxy/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.365493 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wnnv6_0bb24428-cae6-49f4-b4d7-5a33488d5e2e/registry-server/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.426448 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-655bc68c75-ttb9l_3a16ffb8-61fd-4e05-ac7b-277eb20e4f4c/operator/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.602529 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-j5b4z_7199810d-9e13-4ed5-a4bc-46c874551678/kube-rbac-proxy/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.669225 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-fpcpg_0669449f-4e6b-4ab7-90ee-f8d93286db7a/kube-rbac-proxy/0.log" Nov 22 04:47:47 crc 
kubenswrapper[4699]: I1122 04:47:47.705187 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-j5b4z_7199810d-9e13-4ed5-a4bc-46c874551678/manager/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.744625 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-66b5f67bb4-9h4ls_07e128b3-5973-437b-b7ec-80177dacf14f/manager/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.812783 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-fpcpg_0669449f-4e6b-4ab7-90ee-f8d93286db7a/manager/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.893264 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rhx7c_4fb724ba-7502-41eb-aab0-40eacbcd652e/operator/0.log" Nov 22 04:47:47 crc kubenswrapper[4699]: I1122 04:47:47.908834 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-fcjt6_a67d0761-3d62-4e25-80bc-cf6fac86cf0b/kube-rbac-proxy/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.030062 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-fcjt6_a67d0761-3d62-4e25-80bc-cf6fac86cf0b/manager/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.100965 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-lxncl_0dc61afc-07cb-46af-afb8-4c0bf3bc84f0/kube-rbac-proxy/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.126850 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-lxncl_0dc61afc-07cb-46af-afb8-4c0bf3bc84f0/manager/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.224245 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-j5w8t_1cf7d81b-c0df-48d7-9b01-b7185a803ac6/manager/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.231880 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-j5w8t_1cf7d81b-c0df-48d7-9b01-b7185a803ac6/kube-rbac-proxy/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.338463 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-t4rhb_482b57cb-741a-4062-9479-2a41febc67af/kube-rbac-proxy/0.log" Nov 22 04:47:48 crc kubenswrapper[4699]: I1122 04:47:48.344028 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-t4rhb_482b57cb-741a-4062-9479-2a41febc67af/manager/0.log" Nov 22 04:47:52 crc kubenswrapper[4699]: I1122 04:47:52.448346 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:47:52 crc kubenswrapper[4699]: E1122 04:47:52.449209 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:04 crc kubenswrapper[4699]: I1122 04:48:04.288779 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w5jfl_7343df3b-7616-42dd-8e27-5f9a2031a8d9/control-plane-machine-set-operator/0.log" Nov 22 04:48:04 crc kubenswrapper[4699]: I1122 04:48:04.419856 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jlk5m_07ed42bd-25e2-43de-bbd7-431ab818b761/kube-rbac-proxy/0.log" Nov 22 04:48:04 crc kubenswrapper[4699]: I1122 04:48:04.464838 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jlk5m_07ed42bd-25e2-43de-bbd7-431ab818b761/machine-api-operator/0.log" Nov 22 04:48:06 crc kubenswrapper[4699]: I1122 04:48:06.448218 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:48:06 crc kubenswrapper[4699]: E1122 04:48:06.448760 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:16 crc kubenswrapper[4699]: I1122 04:48:16.909394 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-ggffg_2620c9cc-4041-49f0-bd0a-2b227e8214d6/cert-manager-controller/0.log" Nov 22 04:48:17 crc kubenswrapper[4699]: I1122 04:48:17.096919 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-mcllz_69760a96-2f1a-4eca-8bc1-9734e255c260/cert-manager-cainjector/0.log" Nov 22 04:48:17 crc kubenswrapper[4699]: I1122 04:48:17.146187 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-x8jj9_bdbcc3a3-05f4-4840-98d7-1e2417a9ad0b/cert-manager-webhook/0.log" Nov 22 04:48:17 crc kubenswrapper[4699]: I1122 04:48:17.448264 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:48:17 crc kubenswrapper[4699]: E1122 04:48:17.449037 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.015534 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-r7xcp_55b4d3a4-d0be-4184-b9ee-efedf1c27608/nmstate-console-plugin/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.091514 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-429gd_558d67eb-e4e4-46ca-bc65-8e4d568f4037/nmstate-handler/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.129890 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-r67mk_0e1f9c73-89fc-4ab2-aca3-004315167c79/kube-rbac-proxy/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.220291 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-r67mk_0e1f9c73-89fc-4ab2-aca3-004315167c79/nmstate-metrics/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.339654 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-6nj2r_a7e3ac11-e456-48a4-ad00-114a41462661/nmstate-operator/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.437278 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-7qwpr_0d107f7a-e965-41e9-8ceb-5c5ac1c3b530/nmstate-webhook/0.log" Nov 22 04:48:29 crc kubenswrapper[4699]: I1122 04:48:29.448162 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:48:29 crc kubenswrapper[4699]: E1122 04:48:29.448730 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:42 crc kubenswrapper[4699]: I1122 04:48:42.447909 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:48:42 crc kubenswrapper[4699]: E1122 04:48:42.449546 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:42 crc kubenswrapper[4699]: I1122 04:48:42.651959 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-g47xp_dae7dee7-2390-47bc-83c9-488f48a4cc90/kube-rbac-proxy/0.log" Nov 22 04:48:42 crc kubenswrapper[4699]: I1122 
04:48:42.797748 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-g47xp_dae7dee7-2390-47bc-83c9-488f48a4cc90/controller/0.log" Nov 22 04:48:42 crc kubenswrapper[4699]: I1122 04:48:42.912285 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.024585 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.072940 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.074684 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.075203 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.265300 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.314526 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.329461 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.330956 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.480728 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-frr-files/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.497117 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-reloader/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.539527 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/cp-metrics/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.544175 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/controller/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.658057 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/frr-metrics/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.719214 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/kube-rbac-proxy/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.757420 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/kube-rbac-proxy-frr/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.843003 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/reloader/0.log" Nov 22 04:48:43 crc kubenswrapper[4699]: I1122 04:48:43.998682 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-5tpp7_3975d03a-cd82-4ae3-89cb-fcad5f75330c/frr-k8s-webhook-server/0.log" Nov 22 04:48:44 crc kubenswrapper[4699]: I1122 04:48:44.166993 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c65d8d687-6vpd9_f195708e-47e9-45a0-8361-7bbe6b6c6c0b/manager/0.log" Nov 22 04:48:44 crc kubenswrapper[4699]: I1122 04:48:44.303887 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-785654ff4c-blk7v_fbc128ff-e74b-44ee-a1a0-553a38bc79c7/webhook-server/0.log" Nov 22 04:48:44 crc kubenswrapper[4699]: I1122 04:48:44.466927 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-psxdv_07114818-b4f9-465d-9745-c8a05af60e5a/kube-rbac-proxy/0.log" Nov 22 04:48:44 crc kubenswrapper[4699]: I1122 04:48:44.643711 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-fjgk6_2840ab61-4c34-4132-970e-c6d8c615c2bd/frr/0.log" Nov 22 04:48:44 crc kubenswrapper[4699]: I1122 04:48:44.833286 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-psxdv_07114818-b4f9-465d-9745-c8a05af60e5a/speaker/0.log" Nov 22 04:48:55 crc kubenswrapper[4699]: I1122 04:48:55.447916 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:48:55 crc kubenswrapper[4699]: E1122 04:48:55.449203 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:48:56 crc kubenswrapper[4699]: I1122 
04:48:56.826932 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.043725 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.044866 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.055317 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.222378 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/util/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.222514 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/extract/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.241365 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e9hpbr_724005e9-061b-46d4-84ce-611d0ddaa0e5/pull/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.371268 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.531895 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.546543 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.572137 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.759817 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-content/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.760152 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/extract-utilities/0.log" Nov 22 04:48:57 crc kubenswrapper[4699]: I1122 04:48:57.973303 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.069137 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vzxlt_9aa53af0-a1c6-48c6-b081-e100d6a6512f/registry-server/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.105423 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.159267 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.159737 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.368805 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-utilities/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.378673 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/extract-content/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.639175 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.868552 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wzwkb_4e5709de-6870-45ee-979d-4cf3c01d2b20/registry-server/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.921647 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.942936 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:48:58 crc kubenswrapper[4699]: I1122 04:48:58.948660 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.108489 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/util/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.124624 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/pull/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.146843 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c6mmt7g_46bd6dfa-3553-4128-9412-a6d995e86f82/extract/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.327757 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.331479 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8pl4v_cfa86e4d-ee3e-4839-af4e-966184a73dc9/marketplace-operator/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.520004 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.554544 4699 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.562071 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.738875 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-utilities/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.753125 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/extract-content/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.846616 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-js84l_096fc045-af3a-4dff-bfb9-aad031dc0cc0/registry-server/0.log" Nov 22 04:48:59 crc kubenswrapper[4699]: I1122 04:48:59.992158 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.115514 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.142607 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.147874 4699 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.310926 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-content/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.343070 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/extract-utilities/0.log" Nov 22 04:49:00 crc kubenswrapper[4699]: I1122 04:49:00.618305 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wdh9z_8ee0f1f9-a840-4cb2-828e-99b87f87d60e/registry-server/0.log" Nov 22 04:49:06 crc kubenswrapper[4699]: I1122 04:49:06.447755 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:49:06 crc kubenswrapper[4699]: E1122 04:49:06.448375 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:49:21 crc kubenswrapper[4699]: I1122 04:49:21.451050 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:49:21 crc kubenswrapper[4699]: E1122 04:49:21.451966 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:49:36 crc kubenswrapper[4699]: I1122 04:49:36.447277 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:49:36 crc kubenswrapper[4699]: E1122 04:49:36.448107 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:49:48 crc kubenswrapper[4699]: I1122 04:49:48.448109 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:49:48 crc kubenswrapper[4699]: E1122 04:49:48.449299 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:50:01 crc kubenswrapper[4699]: I1122 04:50:01.450988 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:50:01 crc kubenswrapper[4699]: E1122 04:50:01.451796 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:50:15 crc kubenswrapper[4699]: I1122 04:50:15.448183 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:50:15 crc kubenswrapper[4699]: E1122 04:50:15.449011 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.145153 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:23 crc kubenswrapper[4699]: E1122 04:50:23.146871 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" containerName="container-00" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.146906 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" containerName="container-00" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.147408 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc1b4ec-d5fe-4a55-bb8d-fdbcb12db6e0" containerName="container-00" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.151028 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.162807 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.295412 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt566\" (UniqueName: \"kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.297096 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.297154 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.399010 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt566\" (UniqueName: \"kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.399133 4699 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.399166 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.399785 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.400020 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.421155 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt566\" (UniqueName: \"kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566\") pod \"redhat-operators-6tppv\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:23 crc kubenswrapper[4699]: I1122 04:50:23.479832 4699 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:24 crc kubenswrapper[4699]: I1122 04:50:24.488193 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:25 crc kubenswrapper[4699]: I1122 04:50:25.251472 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerDied","Data":"9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e"} Nov 22 04:50:25 crc kubenswrapper[4699]: I1122 04:50:25.251416 4699 generic.go:334] "Generic (PLEG): container finished" podID="388758ee-ef9f-42a3-8435-df26ceca3878" containerID="9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e" exitCode=0 Nov 22 04:50:25 crc kubenswrapper[4699]: I1122 04:50:25.251916 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerStarted","Data":"2ad0a8aa6a172f544a0a6636b3a06477f424f486e766f870ee62e72db9f593e9"} Nov 22 04:50:25 crc kubenswrapper[4699]: I1122 04:50:25.254376 4699 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 04:50:27 crc kubenswrapper[4699]: I1122 04:50:27.448650 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:50:27 crc kubenswrapper[4699]: E1122 04:50:27.449075 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:50:29 crc 
kubenswrapper[4699]: I1122 04:50:29.313087 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerStarted","Data":"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600"} Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.363269 4699 generic.go:334] "Generic (PLEG): container finished" podID="388758ee-ef9f-42a3-8435-df26ceca3878" containerID="afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600" exitCode=0 Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.363383 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerDied","Data":"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600"} Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.367252 4699 generic.go:334] "Generic (PLEG): container finished" podID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerID="083cc5ad80d11505455451c3240f9aff38dda9a72398afd9e2452193c2c87e88" exitCode=0 Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.367285 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" event={"ID":"550ab800-e31d-4c0d-8bbc-6538ce378e8e","Type":"ContainerDied","Data":"083cc5ad80d11505455451c3240f9aff38dda9a72398afd9e2452193c2c87e88"} Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.367742 4699 scope.go:117] "RemoveContainer" containerID="083cc5ad80d11505455451c3240f9aff38dda9a72398afd9e2452193c2c87e88" Nov 22 04:50:33 crc kubenswrapper[4699]: I1122 04:50:33.467196 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg9ms_must-gather-dt6xg_550ab800-e31d-4c0d-8bbc-6538ce378e8e/gather/0.log" Nov 22 04:50:34 crc kubenswrapper[4699]: I1122 04:50:34.380266 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerStarted","Data":"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9"} Nov 22 04:50:35 crc kubenswrapper[4699]: I1122 04:50:35.409033 4699 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6tppv" podStartSLOduration=3.564403791 podStartE2EDuration="12.40901374s" podCreationTimestamp="2025-11-22 04:50:23 +0000 UTC" firstStartedPulling="2025-11-22 04:50:25.254051827 +0000 UTC m=+2576.596673034" lastFinishedPulling="2025-11-22 04:50:34.098661796 +0000 UTC m=+2585.441282983" observedRunningTime="2025-11-22 04:50:35.401010318 +0000 UTC m=+2586.743631505" watchObservedRunningTime="2025-11-22 04:50:35.40901374 +0000 UTC m=+2586.751634927" Nov 22 04:50:38 crc kubenswrapper[4699]: I1122 04:50:38.448161 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:50:38 crc kubenswrapper[4699]: E1122 04:50:38.448883 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:50:43 crc kubenswrapper[4699]: I1122 04:50:43.480883 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:43 crc kubenswrapper[4699]: I1122 04:50:43.482288 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:43 crc kubenswrapper[4699]: I1122 04:50:43.538078 4699 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.205849 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg9ms/must-gather-dt6xg"] Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.206149 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="copy" containerID="cri-o://45d1d8e94f72ffe58772f50e0a4f73691a142d310516f8644240a5ad77066ebf" gracePeriod=2 Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.272755 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg9ms/must-gather-dt6xg"] Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.489618 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg9ms_must-gather-dt6xg_550ab800-e31d-4c0d-8bbc-6538ce378e8e/copy/0.log" Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.492395 4699 generic.go:334] "Generic (PLEG): container finished" podID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerID="45d1d8e94f72ffe58772f50e0a4f73691a142d310516f8644240a5ad77066ebf" exitCode=143 Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.566136 4699 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.623927 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.796832 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg9ms_must-gather-dt6xg_550ab800-e31d-4c0d-8bbc-6538ce378e8e/copy/0.log" Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.797345 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.982603 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzt9z\" (UniqueName: \"kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z\") pod \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.982726 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output\") pod \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\" (UID: \"550ab800-e31d-4c0d-8bbc-6538ce378e8e\") " Nov 22 04:50:44 crc kubenswrapper[4699]: I1122 04:50:44.995384 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z" (OuterVolumeSpecName: "kube-api-access-gzt9z") pod "550ab800-e31d-4c0d-8bbc-6538ce378e8e" (UID: "550ab800-e31d-4c0d-8bbc-6538ce378e8e"). InnerVolumeSpecName "kube-api-access-gzt9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.085179 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzt9z\" (UniqueName: \"kubernetes.io/projected/550ab800-e31d-4c0d-8bbc-6538ce378e8e-kube-api-access-gzt9z\") on node \"crc\" DevicePath \"\"" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.124418 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "550ab800-e31d-4c0d-8bbc-6538ce378e8e" (UID: "550ab800-e31d-4c0d-8bbc-6538ce378e8e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.187181 4699 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/550ab800-e31d-4c0d-8bbc-6538ce378e8e-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.464804 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" path="/var/lib/kubelet/pods/550ab800-e31d-4c0d-8bbc-6538ce378e8e/volumes" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.505513 4699 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg9ms_must-gather-dt6xg_550ab800-e31d-4c0d-8bbc-6538ce378e8e/copy/0.log" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.505992 4699 scope.go:117] "RemoveContainer" containerID="45d1d8e94f72ffe58772f50e0a4f73691a142d310516f8644240a5ad77066ebf" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.506022 4699 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg9ms/must-gather-dt6xg" Nov 22 04:50:45 crc kubenswrapper[4699]: I1122 04:50:45.531341 4699 scope.go:117] "RemoveContainer" containerID="083cc5ad80d11505455451c3240f9aff38dda9a72398afd9e2452193c2c87e88" Nov 22 04:50:46 crc kubenswrapper[4699]: I1122 04:50:46.515050 4699 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6tppv" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="registry-server" containerID="cri-o://e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9" gracePeriod=2 Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.038777 4699 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.227261 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt566\" (UniqueName: \"kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566\") pod \"388758ee-ef9f-42a3-8435-df26ceca3878\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.227446 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content\") pod \"388758ee-ef9f-42a3-8435-df26ceca3878\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.227571 4699 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities\") pod \"388758ee-ef9f-42a3-8435-df26ceca3878\" (UID: \"388758ee-ef9f-42a3-8435-df26ceca3878\") " Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.228310 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities" (OuterVolumeSpecName: "utilities") pod "388758ee-ef9f-42a3-8435-df26ceca3878" (UID: "388758ee-ef9f-42a3-8435-df26ceca3878"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.232448 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566" (OuterVolumeSpecName: "kube-api-access-xt566") pod "388758ee-ef9f-42a3-8435-df26ceca3878" (UID: "388758ee-ef9f-42a3-8435-df26ceca3878"). InnerVolumeSpecName "kube-api-access-xt566". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.329634 4699 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.329688 4699 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt566\" (UniqueName: \"kubernetes.io/projected/388758ee-ef9f-42a3-8435-df26ceca3878-kube-api-access-xt566\") on node \"crc\" DevicePath \"\"" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.330379 4699 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "388758ee-ef9f-42a3-8435-df26ceca3878" (UID: "388758ee-ef9f-42a3-8435-df26ceca3878"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.436197 4699 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388758ee-ef9f-42a3-8435-df26ceca3878-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.526057 4699 generic.go:334] "Generic (PLEG): container finished" podID="388758ee-ef9f-42a3-8435-df26ceca3878" containerID="e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9" exitCode=0 Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.526110 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerDied","Data":"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9"} Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.526116 4699 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6tppv" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.526145 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6tppv" event={"ID":"388758ee-ef9f-42a3-8435-df26ceca3878","Type":"ContainerDied","Data":"2ad0a8aa6a172f544a0a6636b3a06477f424f486e766f870ee62e72db9f593e9"} Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.526169 4699 scope.go:117] "RemoveContainer" containerID="e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.555148 4699 scope.go:117] "RemoveContainer" containerID="afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.558614 4699 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.567945 4699 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6tppv"] Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.580612 4699 scope.go:117] "RemoveContainer" containerID="9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.636741 4699 scope.go:117] "RemoveContainer" containerID="e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9" Nov 22 04:50:47 crc kubenswrapper[4699]: E1122 04:50:47.637211 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9\": container with ID starting with e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9 not found: ID does not exist" containerID="e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.637247 4699 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9"} err="failed to get container status \"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9\": rpc error: code = NotFound desc = could not find container \"e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9\": container with ID starting with e6f7219cec1b6253997ee13975ad0be9a9a253bfcfe83bd3ec0d92f0b62681d9 not found: ID does not exist" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.637275 4699 scope.go:117] "RemoveContainer" containerID="afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600" Nov 22 04:50:47 crc kubenswrapper[4699]: E1122 04:50:47.637636 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600\": container with ID starting with afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600 not found: ID does not exist" containerID="afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.637662 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600"} err="failed to get container status \"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600\": rpc error: code = NotFound desc = could not find container \"afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600\": container with ID starting with afb44876b641bdeb6ce42e50a1c61390720922c7e747b0b18a11a85181e81600 not found: ID does not exist" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.637681 4699 scope.go:117] "RemoveContainer" containerID="9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e" Nov 22 04:50:47 crc kubenswrapper[4699]: E1122 
04:50:47.637941 4699 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e\": container with ID starting with 9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e not found: ID does not exist" containerID="9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e" Nov 22 04:50:47 crc kubenswrapper[4699]: I1122 04:50:47.637968 4699 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e"} err="failed to get container status \"9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e\": rpc error: code = NotFound desc = could not find container \"9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e\": container with ID starting with 9bab7b087dadb5c3a1d5cd9619aade9ec461fa6eaab6a08f4642cb692e998a7e not found: ID does not exist" Nov 22 04:50:49 crc kubenswrapper[4699]: I1122 04:50:49.466681 4699 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" path="/var/lib/kubelet/pods/388758ee-ef9f-42a3-8435-df26ceca3878/volumes" Nov 22 04:50:52 crc kubenswrapper[4699]: I1122 04:50:52.448967 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:50:52 crc kubenswrapper[4699]: E1122 04:50:52.450405 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:51:06 crc kubenswrapper[4699]: I1122 04:51:06.448211 
4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:51:06 crc kubenswrapper[4699]: E1122 04:51:06.449855 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:51:21 crc kubenswrapper[4699]: I1122 04:51:21.447748 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:51:21 crc kubenswrapper[4699]: E1122 04:51:21.448624 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:51:34 crc kubenswrapper[4699]: I1122 04:51:34.448165 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:51:34 crc kubenswrapper[4699]: E1122 04:51:34.448968 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:51:37 crc kubenswrapper[4699]: I1122 
04:51:37.940685 4699 scope.go:117] "RemoveContainer" containerID="a29e5b6eb920ab281c89ef63d94ad0adbaaddffeccc77a20f117c3acd5fb0626" Nov 22 04:51:37 crc kubenswrapper[4699]: I1122 04:51:37.967977 4699 scope.go:117] "RemoveContainer" containerID="449f1dac301f326daf8b131769cdceb2c62928f0ef98833369aa6d98736a161a" Nov 22 04:51:37 crc kubenswrapper[4699]: I1122 04:51:37.987741 4699 scope.go:117] "RemoveContainer" containerID="092a26dc54153ef541491c9650e817f940dba814ec759d25952d5f323c3d885a" Nov 22 04:51:47 crc kubenswrapper[4699]: I1122 04:51:47.449214 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:51:47 crc kubenswrapper[4699]: E1122 04:51:47.450131 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:52:01 crc kubenswrapper[4699]: I1122 04:52:01.447660 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:52:01 crc kubenswrapper[4699]: E1122 04:52:01.448470 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:52:12 crc kubenswrapper[4699]: I1122 04:52:12.447840 4699 scope.go:117] "RemoveContainer" 
containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:52:12 crc kubenswrapper[4699]: E1122 04:52:12.448790 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:52:24 crc kubenswrapper[4699]: I1122 04:52:24.448168 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:52:24 crc kubenswrapper[4699]: E1122 04:52:24.449515 4699 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kjwnt_openshift-machine-config-operator(41bdbae2-706a-4f84-9f56-5a42aec77762)\"" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" podUID="41bdbae2-706a-4f84-9f56-5a42aec77762" Nov 22 04:52:38 crc kubenswrapper[4699]: I1122 04:52:38.096696 4699 scope.go:117] "RemoveContainer" containerID="4f167655a449b518fcd5349399944a03fac418b319b47bae385e032bde3ce6fe" Nov 22 04:52:39 crc kubenswrapper[4699]: I1122 04:52:39.460659 4699 scope.go:117] "RemoveContainer" containerID="09c1fed9b7151bf7233b81b1d357689f9d97efd829231216c50a7352fe56ab7b" Nov 22 04:52:40 crc kubenswrapper[4699]: I1122 04:52:40.661865 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kjwnt" event={"ID":"41bdbae2-706a-4f84-9f56-5a42aec77762","Type":"ContainerStarted","Data":"5553a0f3f1ad922192a1dadc00eeff38b0f57a1d0bc4542e10014a01d39e9144"} Nov 22 04:53:38 crc kubenswrapper[4699]: I1122 
04:53:38.171490 4699 scope.go:117] "RemoveContainer" containerID="2b3231061f3317da9b61b189ccfb20ba35f7c8e24db57446eadd05c67539892e" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.040856 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hkpjt"] Nov 22 04:54:19 crc kubenswrapper[4699]: E1122 04:54:19.041883 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="extract-content" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.041969 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="extract-content" Nov 22 04:54:19 crc kubenswrapper[4699]: E1122 04:54:19.041986 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="gather" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.041991 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="gather" Nov 22 04:54:19 crc kubenswrapper[4699]: E1122 04:54:19.042012 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="copy" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042018 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="copy" Nov 22 04:54:19 crc kubenswrapper[4699]: E1122 04:54:19.042031 4699 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="registry-server" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042037 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="registry-server" Nov 22 04:54:19 crc kubenswrapper[4699]: E1122 04:54:19.042046 4699 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="extract-utilities" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042052 4699 state_mem.go:107] "Deleted CPUSet assignment" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="extract-utilities" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042229 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="copy" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042252 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="550ab800-e31d-4c0d-8bbc-6538ce378e8e" containerName="gather" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.042266 4699 memory_manager.go:354] "RemoveStaleState removing state" podUID="388758ee-ef9f-42a3-8435-df26ceca3878" containerName="registry-server" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.043693 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.061216 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkpjt"] Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.140625 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-catalog-content\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.141144 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-utilities\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " 
pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.141260 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68f4s\" (UniqueName: \"kubernetes.io/projected/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-kube-api-access-68f4s\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.243941 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-utilities\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.243997 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68f4s\" (UniqueName: \"kubernetes.io/projected/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-kube-api-access-68f4s\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.244073 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-catalog-content\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.244665 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-utilities\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " 
pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.244705 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-catalog-content\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.274702 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68f4s\" (UniqueName: \"kubernetes.io/projected/4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0-kube-api-access-68f4s\") pod \"redhat-marketplace-hkpjt\" (UID: \"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0\") " pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.366720 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkpjt" Nov 22 04:54:19 crc kubenswrapper[4699]: I1122 04:54:19.893958 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkpjt"] Nov 22 04:54:20 crc kubenswrapper[4699]: I1122 04:54:20.665354 4699 generic.go:334] "Generic (PLEG): container finished" podID="4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0" containerID="1fbde46d4f76f0810bf8b88737d9b22ff3ea32c73419373125937091929745c9" exitCode=0 Nov 22 04:54:20 crc kubenswrapper[4699]: I1122 04:54:20.665411 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkpjt" event={"ID":"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0","Type":"ContainerDied","Data":"1fbde46d4f76f0810bf8b88737d9b22ff3ea32c73419373125937091929745c9"} Nov 22 04:54:20 crc kubenswrapper[4699]: I1122 04:54:20.665462 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkpjt" 
event={"ID":"4ca8f73e-6c94-4aeb-ab17-bd3f26b2f0a0","Type":"ContainerStarted","Data":"670bc3d82867341bfa5c67fe2b865d13e1b2f22dcea6ab0ec63fa8e43be89a21"} Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.430360 4699 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p4xzm"] Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.432851 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.440785 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4xzm"] Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.518346 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-catalog-content\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.518449 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cclvc\" (UniqueName: \"kubernetes.io/projected/c46775f3-68b1-4083-9c11-89aa681b88cf-kube-api-access-cclvc\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.518787 4699 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-utilities\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.621122 
4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-utilities\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.621291 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-catalog-content\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.621712 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-utilities\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.621790 4699 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46775f3-68b1-4083-9c11-89aa681b88cf-catalog-content\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.621925 4699 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cclvc\" (UniqueName: \"kubernetes.io/projected/c46775f3-68b1-4083-9c11-89aa681b88cf-kube-api-access-cclvc\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.642343 4699 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cclvc\" (UniqueName: \"kubernetes.io/projected/c46775f3-68b1-4083-9c11-89aa681b88cf-kube-api-access-cclvc\") pod \"community-operators-p4xzm\" (UID: \"c46775f3-68b1-4083-9c11-89aa681b88cf\") " pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:28 crc kubenswrapper[4699]: I1122 04:54:28.777936 4699 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p4xzm" Nov 22 04:54:29 crc kubenswrapper[4699]: I1122 04:54:29.274143 4699 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p4xzm"] Nov 22 04:54:29 crc kubenswrapper[4699]: W1122 04:54:29.283928 4699 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46775f3_68b1_4083_9c11_89aa681b88cf.slice/crio-b63ab78878bbf0f92dc978debd9e4f6275876c2e789e247789750a7c5fad5333 WatchSource:0}: Error finding container b63ab78878bbf0f92dc978debd9e4f6275876c2e789e247789750a7c5fad5333: Status 404 returned error can't find the container with id b63ab78878bbf0f92dc978debd9e4f6275876c2e789e247789750a7c5fad5333 Nov 22 04:54:29 crc kubenswrapper[4699]: I1122 04:54:29.733259 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4xzm" event={"ID":"c46775f3-68b1-4083-9c11-89aa681b88cf","Type":"ContainerStarted","Data":"b63ab78878bbf0f92dc978debd9e4f6275876c2e789e247789750a7c5fad5333"} Nov 22 04:54:30 crc kubenswrapper[4699]: I1122 04:54:30.742012 4699 generic.go:334] "Generic (PLEG): container finished" podID="c46775f3-68b1-4083-9c11-89aa681b88cf" containerID="bf48d8c5e8035df2498f4b2a8808546c9e0512223386ea22adc74113a8ac6c37" exitCode=0 Nov 22 04:54:30 crc kubenswrapper[4699]: I1122 04:54:30.742113 4699 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p4xzm" 
event={"ID":"c46775f3-68b1-4083-9c11-89aa681b88cf","Type":"ContainerDied","Data":"bf48d8c5e8035df2498f4b2a8808546c9e0512223386ea22adc74113a8ac6c37"}